Apr 22 19:21:09.881141 ip-10-0-143-198 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:21:09.881156 ip-10-0-143-198 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:21:09.881181 ip-10-0-143-198 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:21:09.881496 ip-10-0-143-198 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:21:20.022239 ip-10-0-143-198 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:21:20.022258 ip-10-0-143-198 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8e419949274148928df30545ba021a94 --
Apr 22 19:23:41.692977 ip-10-0-143-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:42.124810 ip-10-0-143-198 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:42.124810 ip-10-0-143-198 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:42.124810 ip-10-0-143-198 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:42.124810 ip-10-0-143-198 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
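The entries above record one failure chain: a missing environment file aborts the `start-pre` task, the unit fails with result 'resources', and the restart cannot even be scheduled because `crio.service` is absent. Journal text exported this way often arrives with entries run together on a single line; a minimal sketch (function names and the sample string are illustrative, not part of the log) that splits such text back into entries and pulls out the kubelet.service failures:

```python
import re

SAMPLE = (
    "Apr 22 19:21:09.881141 ip-10-0-143-198 systemd[1]: kubelet.service: "
    "Failed to load environment files: No such file or directory "
    "Apr 22 19:21:20.022239 ip-10-0-143-198 systemd[1]: kubelet.service: "
    "Failed to schedule restart job: Unit crio.service not found."
)

def split_entries(text):
    """Split concatenated journal output at each 'Mon DD HH:MM:SS.micros' timestamp."""
    return re.split(r"(?=[A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2}\.\d+ )", text)

def kubelet_failures(text):
    """Return the systemd failure messages recorded for kubelet.service, in order."""
    return [e.strip() for e in split_entries(text)
            if "kubelet.service:" in e and "Failed" in e]
```

Running `kubelet_failures` over the excerpt surfaces the chain in order, which makes it easier to see that the environment-file error precedes (and causes) the 'resources' result.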
Apr 22 19:23:42.124810 ip-10-0-143-198 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:42.126469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.126328 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:42.129332 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129317 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:42.129332 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129331 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129335 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129340 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129345 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129348 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129351 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129354 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129357 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129360 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129363 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129367 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129369 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129372 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129375 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129377 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129380 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129383 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129385 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129388 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:42.129395 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129392 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129395 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129398 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129400 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129403 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129406 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129409 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129412 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129414 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129417 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129419 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129423 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129426 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129428 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129431 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129433 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129435 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129438 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129441 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129443 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:42.129891 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129445 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129448 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129451 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129453 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129456 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129458 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129460 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129463 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129465 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129468 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129470 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129473 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129475 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129478 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129481 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129484 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129487 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129490 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129492 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:42.130364 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129495 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129498 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129500 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129503 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129534 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129538 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129541 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129544 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129547 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129549 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129552 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129554 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129557 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129561 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129563 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129566 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129568 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129571 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129573 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129576 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:42.130843 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129578 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129581 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129583 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129586 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129588 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129591 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.129593 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130643 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130649 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130652 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130655 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130659 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130663 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130666 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130669 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130673 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130676 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130679 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130681 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130685 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:42.131358 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130687 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130702 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130705 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130708 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130710 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130713 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130715 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130718 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130721 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130723 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130726 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130729 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130732 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130734 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130737 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130739 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130741 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130744 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130747 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:42.131845 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130750 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130753 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130756 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130758 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130761 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130763 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130766 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130768 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130771 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130773 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130776 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130779 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130781 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130784 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130787 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130790 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130793 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130795 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130797 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130800 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:42.132343 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130802 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130805 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130807 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130810 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130812 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130815 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130817 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130821 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130823 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130826 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130828 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130831 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130834 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130836 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130839 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130841 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130844 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130848 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130851 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130854 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:42.132837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130857 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130859 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130862 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130866 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130868 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130871 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130873 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130876 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130879 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130881 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130884 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130887 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130889 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.130892 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.130971 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.130980 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.130989 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.130995 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131000 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131003 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131007 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:42.133320 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131012 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131015 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131018 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131021 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131025 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131028 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131031 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131034 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131037 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131039 2564 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131042 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131045 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131050 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131052 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131055 2564 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131059 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131062 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131066 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131072 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131075 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131078 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131081 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131084 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131087 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131090 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:42.133900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131093 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131098 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131100 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131103 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131106 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131109 2564 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131112 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131116 2564 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131119 2564 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131122 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131126 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131129 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131132 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131135 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131138 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131141 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131144 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131147 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131150 2564 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131153 2564 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131156 2564 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131158 2564 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131161 2564 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131165 2564 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131167 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:42.134482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131172 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131176 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131179 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131182 2564 flags.go:64] FLAG: --help="false" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131184 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-143-198.ec2.internal" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131187 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131190 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131193 2564 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131197 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131200 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131203 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131205 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131209 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131211 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131214 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131217 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131220 2564 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131223 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131226 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131229 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131232 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:42.135090 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:42.131235 2564 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131250 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131254 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:42.135090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131257 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131263 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131266 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131269 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131272 2564 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131274 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131277 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131280 2564 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131284 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131288 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131292 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131296 2564 flags.go:64] FLAG: --max-pods="110" Apr 22 
19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131299 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131302 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131305 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131307 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131310 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131313 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131316 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131323 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131326 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131329 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131332 2564 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:42.135648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131335 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131341 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:23:42.131344 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131347 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131350 2564 flags.go:64] FLAG: --port="10250" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131353 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131356 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04ac7c7dab9db1958" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131359 2564 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131363 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131366 2564 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131369 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131372 2564 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131376 2564 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131379 2564 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131382 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131385 2564 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131393 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131397 2564 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131400 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131403 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131410 2564 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131413 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131416 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131419 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131421 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131424 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:42.136197 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131427 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131430 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131433 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131436 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131438 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131441 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 
19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131444 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131447 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131450 2564 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131453 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131458 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131461 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131463 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131468 2564 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131471 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131473 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131476 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131479 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131482 2564 flags.go:64] FLAG: --v="2" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131485 2564 flags.go:64] FLAG: --version="false" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131489 2564 flags.go:64] FLAG: --vmodule="" 
Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131493 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.131496 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131602 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131606 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:42.136831 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131609 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131613 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131617 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131619 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131622 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131624 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131627 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131630 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 
19:23:42.131632 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131635 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131638 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131640 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131643 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131645 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131648 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131650 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131653 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131655 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131658 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:42.137414 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131661 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131663 2564 feature_gate.go:328] unrecognized 
feature gate: BuildCSIVolumes Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131666 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131668 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131671 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131673 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131676 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131678 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131680 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131683 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131685 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131689 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131705 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131708 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131712 
2564 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131714 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131717 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131720 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131723 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131725 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:42.137915 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131728 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131730 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131733 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131735 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131738 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131740 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131743 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:42.138390 ip-10-0-143-198 
kubenswrapper[2564]: W0422 19:23:42.131745 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131748 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131750 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131753 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131756 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131758 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131761 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131763 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131766 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131768 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131770 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131773 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131775 2564 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:42.138390 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131778 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131780 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131783 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131786 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131789 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131791 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131797 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131800 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131803 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131807 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131810 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131813 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131815 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131818 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131822 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131825 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131829 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131831 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131834 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:42.138893 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131836 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131839 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131841 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131844 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131846 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.131849 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.132453 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.138549 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.138564 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138607 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138612 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138615 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138618 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138621 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138624 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138627 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:42.139354 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138629 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138632 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138634 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138637 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138640 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138642 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138645 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138647 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138650 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138652 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138655 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138657 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138660 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138662 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138665 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138667 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138670 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138673 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138675 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138678 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:42.139760 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138680 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138683 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138686 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138704 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138712 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138717 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138721 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138724 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138727 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138730 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138732 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138735 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138737 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138740 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138742 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138745 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138747 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138750 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138752 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:42.140225 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138755 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138757 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138760 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138762 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138765 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138767 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138770 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138772 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138775 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138778 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138781 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138783 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138786 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138788 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138791 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138793 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138796 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138799 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138802 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138805 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:42.140706 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138808 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138810 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138813 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138815 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138818 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138820 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138822 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138825 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138827 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138830 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138832 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138835 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138837 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138839 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138842 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138844 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138847 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138849 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138852 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:42.141211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138854 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.138859 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138952 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138957 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138960 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138963 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138966 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138968 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138971 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138974 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138978 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138980 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138984 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138987 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138990 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:42.141656 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138992 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138995 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.138998 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139000 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139003 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139005 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139008 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139010 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139013 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139015 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139018 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139020 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139023 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139025 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139028 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139030 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139033 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139035 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139038 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139041 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:42.142033 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139043 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139046 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139048 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139051 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139053 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139055 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139058 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139061 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139063 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139065 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139068 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139071 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139073 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139076 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139078 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139080 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139083 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139085 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139088 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139090 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:42.142503 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139092 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139095 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139097 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139100 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139103 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139105 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139107 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139110 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139113 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139115 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139117 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139120 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139122 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139125 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139127 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139130 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139132 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139134 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139137 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:42.142995 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139140 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139143 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139147 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139149 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139152 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139155 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139158 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139160 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139162 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139165 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139167 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139169 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139172 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:42.139174 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.139179 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:42.143450 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.139958 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:42.143819 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.142492 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:42.143819 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.143365 2564 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:42.143819 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.143456 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:42.144575 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.144562 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:42.168843 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.168828 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:42.174514 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.174486 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:42.192347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.192328 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:42.197411 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.197390 2564 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:42.198331 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.198306 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:42.198507 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.198492 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:42.203244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.203220 2564 fs.go:135] Filesystem UUIDs: map[211a4247-7191-45dd-a0ed-1d646f29a6da:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a28cdaf2-36b7-4062-9777-5f8c6a379a3a:/dev/nvme0n1p4]
Apr 22 19:23:42.203299 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.203244 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:42.208766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.208650 2564 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:42.206838847 +0000 UTC m=+0.401033425 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3194070 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec230a48d61ae36a99bd30d5cb9ec20d SystemUUID:ec230a48-d61a-e36a-99bd-30d5cb9ec20d BootID:8e419949-2741-4892-8df3-0545ba021a94 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b1:8a:5d:8e:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b1:8a:5d:8e:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:44:9f:fb:fc:37 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:42.208766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.208761 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:42.208868 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.208841 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:42.209882 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.209856 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:42.210055 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.209885 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:42.210097 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.210064 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:42.210097 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.210073 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:42.210097 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.210089 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:42.210909 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.210899 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:42.211630 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.211620 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:42.211744 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.211735 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:23:42.214024 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.214014 2564 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:23:42.214067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.214028 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:23:42.214067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.214040 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:23:42.214067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.214050 2564 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:23:42.214067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.214064 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:23:42.215050 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.215038 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:42.215100 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.215055 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:42.218315 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.218295 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:23:42.219921 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.219904 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 19:23:42.221702 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221676 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221709 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221716 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221727 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221733 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221739 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221745 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221750 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221759 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 19:23:42.221767 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221765 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 19:23:42.222000 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221779 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 19:23:42.222000 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.221791 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 19:23:42.222826 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.222815 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 19:23:42.222870 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.222832 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 19:23:42.224346 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.224321 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:23:42.224414 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.224335 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:23:42.225854 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.225825 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:23:42.228146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.228131 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:23:42.228220 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.228176 2564 server.go:1295] "Started kubelet"
Apr 22 19:23:42.228286 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.228260 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:23:42.228367 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.228330 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:23:42.228418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.228384 2564 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:23:42.228974 ip-10-0-143-198 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:23:42.229288 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.229273 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:42.230825 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.230812 2564 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:42.233874 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.233858 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:42.233874 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.233870 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:42.234645 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234616 2564 factory.go:55] Registering systemd factory
Apr 22 19:23:42.234645 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234639 2564 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:42.234799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234658 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:42.234799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234708 2564 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:42.234799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234727 2564 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:42.234799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234800 2564 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:42.235008 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.234810 2564 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:42.235080 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.235016 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.235152 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235133 2564 factory.go:153] Registering CRI-O factory
Apr 22 19:23:42.235223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235214 2564 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:42.235332 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235316 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:42.235399 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235346 2564 factory.go:103] Registering Raw factory
Apr 22 19:23:42.235399 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235361 2564 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:42.235872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.235858 2564 manager.go:319] Starting recovery of all containers
Apr 22 19:23:42.236874 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.235426 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-198.ec2.internal.18a8c43db0562000 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-198.ec2.internal,UID:ip-10-0-143-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-198.ec2.internal,},FirstTimestamp:2026-04-22 19:23:42.228144128 +0000 UTC m=+0.422338713,LastTimestamp:2026-04-22 19:23:42.228144128 +0000 UTC m=+0.422338713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-198.ec2.internal,}"
Apr 22 19:23:42.237087 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.237064 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:42.240334 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.240314 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8m9s9"
Apr 22 19:23:42.242346 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.242321 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 19:23:42.242513 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.242483 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 19:23:42.245447 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.245423 2564 manager.go:324] Recovery completed
Apr 22 19:23:42.250448 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.250329 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8m9s9"
Apr 22 19:23:42.250581 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.250569 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.253777 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.253762 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.253842 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.253791 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.253842 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.253802 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.254253 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.254239 2564 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:23:42.254253 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.254250 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:23:42.254382 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.254267 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:42.255421 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.255360 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-198.ec2.internal.18a8c43db1dd3aa2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-198.ec2.internal,UID:ip-10-0-143-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-198.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-198.ec2.internal,},FirstTimestamp:2026-04-22 19:23:42.253775522 +0000 UTC m=+0.447970101,LastTimestamp:2026-04-22 19:23:42.253775522 +0000 UTC m=+0.447970101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-198.ec2.internal,}"
Apr 22 19:23:42.256481 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.256468 2564 policy_none.go:49] "None policy: Start"
Apr 22 19:23:42.256563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.256486 2564 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:23:42.256563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.256499 2564 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:23:42.291246 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291230 2564 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.291263 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291276 2564 server.go:85] "Starting device plugin registration server"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291682 2564 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291713 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291830 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291943 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.291953 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.292423 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:23:42.307347 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.292458 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.361676 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.361640 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 19:23:42.362823 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.362807 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 19:23:42.362915 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.362836 2564 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 19:23:42.362915 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.362869 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 19:23:42.362915 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.362879 2564 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:23:42.363039 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.362920 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:23:42.366222 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.366200 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:42.392460 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.392404 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.393549 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.393519 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.393641 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.393555 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.393641 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.393568 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.393641 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.393596 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.402283 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.402268 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.402344 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.402290 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-198.ec2.internal\": node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.419827 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.419802 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.463584 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.463557 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"]
Apr 22 19:23:42.463679 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.463623 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.464470 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.464456 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.464538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.464482 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.464538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.464494 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.466557 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.466544 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.466721 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.466707 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.466787 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.466739 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.467242 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467221 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.467311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467229 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.467311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467272 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.467311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467284 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.467311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467251 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.467435 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.467312 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.469151 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.469136 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.469217 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.469158 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:42.469844 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.469828 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:42.469918 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.469854 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:42.469918 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.469865 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:42.498323 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.498300 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-198.ec2.internal\" not found" node="ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.502362 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.502345 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-198.ec2.internal\" not found" node="ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.520876 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.520853 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.536954 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.536933 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.537017 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.536960 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.537017 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.536976 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42a9abee4bf02f33d29bcb7ba1a804d2-config\") pod \"kube-apiserver-proxy-ip-10-0-143-198.ec2.internal\" (UID: \"42a9abee4bf02f33d29bcb7ba1a804d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.621569 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.621531 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.637901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.637875 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.637964 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.637907 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.637964 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.637925 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42a9abee4bf02f33d29bcb7ba1a804d2-config\") pod \"kube-apiserver-proxy-ip-10-0-143-198.ec2.internal\" (UID: \"42a9abee4bf02f33d29bcb7ba1a804d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.638032 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.637972 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.638032 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.637971 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9710bcde2b9428f4968b64f229b324a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal\" (UID: \"9710bcde2b9428f4968b64f229b324a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.638032 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.638002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/42a9abee4bf02f33d29bcb7ba1a804d2-config\") pod \"kube-apiserver-proxy-ip-10-0-143-198.ec2.internal\" (UID: \"42a9abee4bf02f33d29bcb7ba1a804d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.722369 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.722300 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.799875 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.799850 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.804441 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:42.804421 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal"
Apr 22 19:23:42.823232 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.823209 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:42.923807 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:42.923771 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:43.024314 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:43.024238 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:43.070470 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.070441 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:43.115036 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.115014 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:43.125289 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:43.125268 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:43.142763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.142735 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:43.142865 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.142850 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:43.142919 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.142879 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:43.142919 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.142909 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:43.226243 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:43.226214 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found"
Apr 22 19:23:43.234598 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.234575 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:43.244024 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.244003 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:43.252157 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.252134 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:42 +0000 UTC" deadline="2027-11-27 06:28:23.169873351 +0000 UTC"
Apr 22 19:23:43.252212 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.252157 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14003h4m39.917719487s"
Apr 22 19:23:43.326514 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:43.326453 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node
\"ip-10-0-143-198.ec2.internal\" not found" Apr 22 19:23:43.346585 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.346563 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fg56" Apr 22 19:23:43.354959 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.354941 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fg56" Apr 22 19:23:43.384431 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:43.384401 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9710bcde2b9428f4968b64f229b324a2.slice/crio-0013222338da10b34578f9d907709de542cc1fef143a725c77d92fa66c8af632 WatchSource:0}: Error finding container 0013222338da10b34578f9d907709de542cc1fef143a725c77d92fa66c8af632: Status 404 returned error can't find the container with id 0013222338da10b34578f9d907709de542cc1fef143a725c77d92fa66c8af632 Apr 22 19:23:43.384677 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:43.384653 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a9abee4bf02f33d29bcb7ba1a804d2.slice/crio-5bcc8157e621f1a504b62c266e9e63f5bbdd8365708c4a72d2684f0a8f0c9198 WatchSource:0}: Error finding container 5bcc8157e621f1a504b62c266e9e63f5bbdd8365708c4a72d2684f0a8f0c9198: Status 404 returned error can't find the container with id 5bcc8157e621f1a504b62c266e9e63f5bbdd8365708c4a72d2684f0a8f0c9198 Apr 22 19:23:43.388848 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.388836 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:43.427439 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:43.427413 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-198.ec2.internal\" not found" Apr 22 19:23:43.473576 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:43.473553 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:43.534817 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.534798 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal" Apr 22 19:23:43.546954 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.546938 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:43.547778 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.547766 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal" Apr 22 19:23:43.557111 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:43.557094 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:44.215239 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.215210 2564 apiserver.go:52] "Watching apiserver" Apr 22 19:23:44.226715 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.226677 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:44.227049 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.227026 2564 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-88jfr","kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt","openshift-cluster-node-tuning-operator/tuned-hvwp6","openshift-dns/node-resolver-7dhdx","openshift-image-registry/node-ca-v22lp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal","openshift-multus/multus-2222w","openshift-multus/multus-additional-cni-plugins-8vbtk","openshift-multus/network-metrics-daemon-jblt6","openshift-network-diagnostics/network-check-target-m4bzg","openshift-network-operator/iptables-alerter-rfw29","openshift-ovn-kubernetes/ovnkube-node-vrxcm"] Apr 22 19:23:44.229679 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.229657 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:44.229777 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.229754 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:44.231847 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.231824 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.234025 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.234003 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.234328 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.234309 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.234408 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.234390 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.235088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.234902 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xgnxp\"" Apr 22 19:23:44.235088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.234922 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:44.236183 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.236169 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7dhdx" Apr 22 19:23:44.236541 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.236523 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.238485 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.238471 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-v22lp" Apr 22 19:23:44.240242 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240226 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.240416 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240402 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b5jlt\"" Apr 22 19:23:44.240566 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240548 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.240638 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240626 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qgw28\"" Apr 22 19:23:44.240708 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240681 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.240784 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.240767 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2222w" Apr 22 19:23:44.242911 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.242891 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.244996 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.244980 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-88jfr" Apr 22 19:23:44.247192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247166 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-tmp\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247193 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247170 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247212 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-device-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247235 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-sys-fs\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247258 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38bec921-89a6-4a82-b51d-20431c5dedc1-hosts-file\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx" Apr 22 19:23:44.247279 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.247255 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247292 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-etc-selinux\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247325 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-conf\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247352 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-sys\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247376 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-tuned\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247391 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8njd\" 
(UniqueName: \"kubernetes.io/projected/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-kube-api-access-q8njd\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247407 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247423 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-registration-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247438 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247451 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-lib-modules\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:44.247466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-kubernetes\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247489 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-var-lib-kubelet\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247503 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-host\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.247563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247522 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvmq9\" (UniqueName: \"kubernetes.io/projected/38bec921-89a6-4a82-b51d-20431c5dedc1-kube-api-access-hvmq9\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247583 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-socket-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: 
\"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247607 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-modprobe-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247626 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysconfig\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247645 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38bec921-89a6-4a82-b51d-20431c5dedc1-tmp-dir\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247670 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xbp4\" (UniqueName: \"kubernetes.io/projected/1987be77-a025-44d0-b506-5a4bb7b2c605-kube-api-access-5xbp4\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247735 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cdfh5\" (UniqueName: \"kubernetes.io/projected/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-kube-api-access-cdfh5\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247771 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-systemd\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247782 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.247796 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-run\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.248011 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.248088 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:23:44.248142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.248103 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hkfdv\"" Apr 22 19:23:44.249293 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:23:44.249277 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:23:44.249367 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.249318 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rx9ps\"" Apr 22 19:23:44.249478 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.249460 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfw29" Apr 22 19:23:44.250104 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250090 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.250367 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250348 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.250452 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250437 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.250512 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250448 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:44.250592 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250437 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-26kft\"" Apr 22 19:23:44.250742 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250725 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.250809 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.250754 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:44.251172 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.251113 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:44.251474 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.251454 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d7znh\"" Apr 22 19:23:44.252257 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.252063 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.252515 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.252447 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gsd26\"" Apr 22 19:23:44.252515 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.252479 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.252687 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.252665 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:44.253510 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.253491 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:44.256047 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.255986 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:44.257430 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257407 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gfcjx\""
Apr 22 19:23:44.257523 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257437 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:44.257523 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257447 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:44.257523 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257508 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:44.257651 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257617 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:44.257740 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.257726 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:44.335930 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.335903 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:23:44.348556 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348529 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-device-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348571 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-k8s-cni-cncf-io\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348596 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-daemon-config\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348619 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-multus-certs\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348641 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60db5cd9-d42e-4ebb-b880-d777700e74ea-agent-certs\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348667 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38bec921-89a6-4a82-b51d-20431c5dedc1-hosts-file\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348667 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-device-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.348689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348688 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-sys\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348732 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-tuned\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348759 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-env-overrides\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.349019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348779 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38bec921-89a6-4a82-b51d-20431c5dedc1-hosts-file\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.349019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348783 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5wr\" (UniqueName: \"kubernetes.io/projected/41bc5667-8d3d-482d-9edb-6340167eb814-kube-api-access-tx5wr\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.349019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-sys\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349138 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:23:44.349390 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.348902 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp9m7\" (UniqueName: \"kubernetes.io/projected/d46c700e-0847-4daf-bd26-4b29c5bee728-kube-api-access-dp9m7\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.349432 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349413 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-os-release\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.349481 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349439 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-kubelet\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.349481 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349465 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.349573 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349490 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60db5cd9-d42e-4ebb-b880-d777700e74ea-konnectivity-ca\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.349573 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349573 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-lib-modules\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349573 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349569 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-conf-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349596 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9ct\" (UniqueName: \"kubernetes.io/projected/5092e3d5-3682-4a0c-bf3d-5313cc838278-kube-api-access-9n9ct\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349618 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-systemd-units\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349640 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-etc-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349667 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349670 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-lib-modules\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.349743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349685 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-os-release\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349740 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349773 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfh5\" (UniqueName: \"kubernetes.io/projected/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-kube-api-access-cdfh5\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-bin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349821 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-hostroot\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349845 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-kubernetes\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349868 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-var-lib-kubelet\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-host\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-socket-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349941 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-modprobe-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349944 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-var-lib-kubelet\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349965 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38bec921-89a6-4a82-b51d-20431c5dedc1-tmp-dir\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.350012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.349999 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-kubernetes\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350041 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350085 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-ovn\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350108 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-host\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350133 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350158 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d46c700e-0847-4daf-bd26-4b29c5bee728-iptables-alerter-script\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-host\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350177 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-socket-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350188 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-modprobe-d\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-socket-dir-parent\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-multus\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350312 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38bec921-89a6-4a82-b51d-20431c5dedc1-tmp-dir\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350311 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-cnibin\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350379 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-systemd\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350415 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-systemd\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-run\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350462 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-run\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.350483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350467 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-netns\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350495 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90687231-1b2c-4845-9e57-cab76563d259-ovn-node-metrics-cert\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350536 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-script-lib\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350559 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-tmp\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350594 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350662 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-sys-fs\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350705 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-cnibin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350732 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-cni-binary-copy\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350757 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-var-lib-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350757 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-sys-fs\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350784 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-system-cni-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350830 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-etc-selinux\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350862 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-conf\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350917 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-kubelet\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350957 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysctl-conf\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.350969 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-etc-selinux\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.351186 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351088 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351118 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-serviceca\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351146 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8njd\" (UniqueName: \"kubernetes.io/projected/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-kube-api-access-q8njd\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351198 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-config\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351237 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351315 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-registration-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351349 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-bin\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351371 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-netd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.351381 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351395 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d46c700e-0847-4daf-bd26-4b29c5bee728-host-slash\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351418 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-slash\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.351462 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:44.851419371 +0000 UTC m=+3.045613956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351442 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1987be77-a025-44d0-b506-5a4bb7b2c605-registration-dir\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351514 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-systemd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351558 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvmq9\" (UniqueName: \"kubernetes.io/projected/38bec921-89a6-4a82-b51d-20431c5dedc1-kube-api-access-hvmq9\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.352019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351583 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysconfig\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351603 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-node-log\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351623 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xbp4\" (UniqueName: \"kubernetes.io/projected/1987be77-a025-44d0-b506-5a4bb7b2c605-kube-api-access-5xbp4\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]:
I0422 19:23:44.351671 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-binary-copy\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351733 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-etc-kubernetes\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351745 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-sysconfig\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351754 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-netns\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351803 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqbw\" (UniqueName: \"kubernetes.io/projected/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-kube-api-access-4qqbw\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351829 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-system-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351853 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-log-socket\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.352763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.351875 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c4lc\" (UniqueName: \"kubernetes.io/projected/90687231-1b2c-4845-9e57-cab76563d259-kube-api-access-4c4lc\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.353146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.352841 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-etc-tuned\") pod 
\"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.353146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.352908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-tmp\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.355672 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.355647 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:43 +0000 UTC" deadline="2027-12-09 06:05:56.769312361 +0000 UTC" Apr 22 19:23:44.355672 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.355671 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14290h42m12.413644266s" Apr 22 19:23:44.358903 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.358870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfh5\" (UniqueName: \"kubernetes.io/projected/fb1d2f05-e6fc-4b8c-a646-fdebb0847854-kube-api-access-cdfh5\") pod \"tuned-hvwp6\" (UID: \"fb1d2f05-e6fc-4b8c-a646-fdebb0847854\") " pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" Apr 22 19:23:44.360372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.360344 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvmq9\" (UniqueName: \"kubernetes.io/projected/38bec921-89a6-4a82-b51d-20431c5dedc1-kube-api-access-hvmq9\") pod \"node-resolver-7dhdx\" (UID: \"38bec921-89a6-4a82-b51d-20431c5dedc1\") " pod="openshift-dns/node-resolver-7dhdx" Apr 22 19:23:44.360550 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.360529 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5xbp4\" (UniqueName: \"kubernetes.io/projected/1987be77-a025-44d0-b506-5a4bb7b2c605-kube-api-access-5xbp4\") pod \"aws-ebs-csi-driver-node-k8tpt\" (UID: \"1987be77-a025-44d0-b506-5a4bb7b2c605\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" Apr 22 19:23:44.361039 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.361021 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8njd\" (UniqueName: \"kubernetes.io/projected/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-kube-api-access-q8njd\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:44.369372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.369328 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal" event={"ID":"9710bcde2b9428f4968b64f229b324a2","Type":"ContainerStarted","Data":"0013222338da10b34578f9d907709de542cc1fef143a725c77d92fa66c8af632"} Apr 22 19:23:44.370260 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.370236 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal" event={"ID":"42a9abee4bf02f33d29bcb7ba1a804d2","Type":"ContainerStarted","Data":"5bcc8157e621f1a504b62c266e9e63f5bbdd8365708c4a72d2684f0a8f0c9198"} Apr 22 19:23:44.452967 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.452939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-cnibin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.452977 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-cni-binary-copy\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453001 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-var-lib-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-system-cni-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453040 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-kubelet\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453063 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453066 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-cnibin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-serviceca\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-var-lib-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453120 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-kubelet\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-config\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453162 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-system-cni-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453164 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-bin\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453241 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-netd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: 
I0422 19:23:44.453266 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d46c700e-0847-4daf-bd26-4b29c5bee728-host-slash\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453291 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-slash\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453315 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-systemd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453342 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-node-log\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453366 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:44.453390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453414 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-binary-copy\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453438 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-etc-kubernetes\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453241 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-bin\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453463 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-netns\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:44.453463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-slash\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.453624 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453498 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d46c700e-0847-4daf-bd26-4b29c5bee728-host-slash\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453525 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-netns\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453534 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-cni-netd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453565 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:23:44.453602 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-systemd\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453634 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-node-log\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453661 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqbw\" (UniqueName: \"kubernetes.io/projected/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-kube-api-access-4qqbw\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453687 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-system-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453685 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-config\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453733 
2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-log-socket\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453765 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-log-socket\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453775 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c4lc\" (UniqueName: \"kubernetes.io/projected/90687231-1b2c-4845-9e57-cab76563d259-kube-api-access-4c4lc\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-k8s-cni-cncf-io\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453816 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453837 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-daemon-config\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453864 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-multus-certs\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453815 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-system-cni-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-k8s-cni-cncf-io\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.454412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453740 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-etc-kubernetes\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453918 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-multus-certs\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60db5cd9-d42e-4ebb-b880-d777700e74ea-agent-certs\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.453975 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-env-overrides\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454003 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5wr\" (UniqueName: \"kubernetes.io/projected/41bc5667-8d3d-482d-9edb-6340167eb814-kube-api-access-tx5wr\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454015 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-serviceca\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp" Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454030 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dp9m7\" (UniqueName: \"kubernetes.io/projected/d46c700e-0847-4daf-bd26-4b29c5bee728-kube-api-access-dp9m7\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-os-release\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454068 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-kubelet\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454088 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454112 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60db5cd9-d42e-4ebb-b880-d777700e74ea-konnectivity-ca\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454131 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-conf-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454147 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9ct\" (UniqueName: \"kubernetes.io/projected/5092e3d5-3682-4a0c-bf3d-5313cc838278-kube-api-access-9n9ct\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-systemd-units\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454190 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-etc-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454206 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-os-release\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454220 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.455207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454236 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-bin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-hostroot\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454318 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-host-kubelet\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454325 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-daemon-config\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454337 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-etc-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-ovn\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454392 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-env-overrides\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-openvswitch\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454401 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-systemd-units\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454430 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-os-release\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454436 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-conf-dir\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-hostroot\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454432 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-bin\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454456 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-host\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454537 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454582 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-os-release\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.455901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454609 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90687231-1b2c-4845-9e57-cab76563d259-run-ovn\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454610 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d46c700e-0847-4daf-bd26-4b29c5bee728-iptables-alerter-script\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454645 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-socket-dir-parent\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454669 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-multus\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-cnibin\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454750 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-netns\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454773 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60db5cd9-d42e-4ebb-b880-d777700e74ea-konnectivity-ca\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-host\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454824 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41bc5667-8d3d-482d-9edb-6340167eb814-cnibin\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454831 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-var-lib-cni-multus\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454858 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90687231-1b2c-4845-9e57-cab76563d259-ovn-node-metrics-cert\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454869 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-host-run-netns\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5092e3d5-3682-4a0c-bf3d-5313cc838278-multus-socket-dir-parent\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454918 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-script-lib\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.454988 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.455049 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-cni-binary-copy\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.455165 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5092e3d5-3682-4a0c-bf3d-5313cc838278-cni-binary-copy\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.456644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.455171 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d46c700e-0847-4daf-bd26-4b29c5bee728-iptables-alerter-script\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.457343 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.455521 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41bc5667-8d3d-482d-9edb-6340167eb814-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.457343 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.455523 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90687231-1b2c-4845-9e57-cab76563d259-ovnkube-script-lib\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.457343 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.456651 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60db5cd9-d42e-4ebb-b880-d777700e74ea-agent-certs\") pod \"konnectivity-agent-88jfr\" (UID: \"60db5cd9-d42e-4ebb-b880-d777700e74ea\") " pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.457343 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.457272 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90687231-1b2c-4845-9e57-cab76563d259-ovn-node-metrics-cert\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.463647 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.463622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqbw\" (UniqueName: \"kubernetes.io/projected/92374958-e1ad-48ca-bdd6-3c9a98c2e9e3-kube-api-access-4qqbw\") pod \"node-ca-v22lp\" (UID: \"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3\") " pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.466745 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.466589 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:44.466745 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.466612 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:44.466745 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.466626 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:44.466745 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.466713 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:44.966681278 +0000 UTC m=+3.160875847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:44.467372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.467348 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c4lc\" (UniqueName: \"kubernetes.io/projected/90687231-1b2c-4845-9e57-cab76563d259-kube-api-access-4c4lc\") pod \"ovnkube-node-vrxcm\" (UID: \"90687231-1b2c-4845-9e57-cab76563d259\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.467488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.467445 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp9m7\" (UniqueName: \"kubernetes.io/projected/d46c700e-0847-4daf-bd26-4b29c5bee728-kube-api-access-dp9m7\") pod \"iptables-alerter-rfw29\" (UID: \"d46c700e-0847-4daf-bd26-4b29c5bee728\") " pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.468993 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.468975 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9ct\" (UniqueName: \"kubernetes.io/projected/5092e3d5-3682-4a0c-bf3d-5313cc838278-kube-api-access-9n9ct\") pod \"multus-2222w\" (UID: \"5092e3d5-3682-4a0c-bf3d-5313cc838278\") " pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.469082 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.469012 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5wr\" (UniqueName: \"kubernetes.io/projected/41bc5667-8d3d-482d-9edb-6340167eb814-kube-api-access-tx5wr\") pod \"multus-additional-cni-plugins-8vbtk\" (UID: \"41bc5667-8d3d-482d-9edb-6340167eb814\") " pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.550013 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.549977 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt"
Apr 22 19:23:44.557423 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.557398 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:44.560772 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.560748 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hvwp6"
Apr 22 19:23:44.567338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.567318 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7dhdx"
Apr 22 19:23:44.572906 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.572888 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v22lp"
Apr 22 19:23:44.579479 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.579457 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2222w"
Apr 22 19:23:44.587045 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.587023 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8vbtk"
Apr 22 19:23:44.593592 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.593573 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:23:44.601130 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.601111 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rfw29"
Apr 22 19:23:44.607731 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.607713 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:23:44.858526 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:44.858459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:23:44.858646 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.858568 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:44.858646 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:44.858622 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.858608037 +0000 UTC m=+4.052802602 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:45.010262 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.010231 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1d2f05_e6fc_4b8c_a646_fdebb0847854.slice/crio-e626a43a0c3dc4ce7163640a27d2183fa55affb0121f41fcdcc7b65f7c064645 WatchSource:0}: Error finding container e626a43a0c3dc4ce7163640a27d2183fa55affb0121f41fcdcc7b65f7c064645: Status 404 returned error can't find the container with id e626a43a0c3dc4ce7163640a27d2183fa55affb0121f41fcdcc7b65f7c064645
Apr 22 19:23:45.011427 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.011374 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bec921_89a6_4a82_b51d_20431c5dedc1.slice/crio-725ce38d4e333df1f8d19e125bc15cb00796ee4752d4266e0ad1742cd778b2c6 WatchSource:0}: Error finding container 725ce38d4e333df1f8d19e125bc15cb00796ee4752d4266e0ad1742cd778b2c6: Status 404 returned error can't find the container with id 725ce38d4e333df1f8d19e125bc15cb00796ee4752d4266e0ad1742cd778b2c6
Apr 22 19:23:45.012085 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.012060 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92374958_e1ad_48ca_bdd6_3c9a98c2e9e3.slice/crio-bbca4e598eaed955433e985113faecedf81abe7d81a28fcab13aa7aa852b79fc WatchSource:0}: Error finding container bbca4e598eaed955433e985113faecedf81abe7d81a28fcab13aa7aa852b79fc: Status 404 returned error can't find the container with id bbca4e598eaed955433e985113faecedf81abe7d81a28fcab13aa7aa852b79fc
Apr 22 19:23:45.013019 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.012995 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46c700e_0847_4daf_bd26_4b29c5bee728.slice/crio-540fc9c2596467f0cfb513f1d02e3011b267a2f83ff63120ba5fc165afcc15a7 WatchSource:0}: Error finding container 540fc9c2596467f0cfb513f1d02e3011b267a2f83ff63120ba5fc165afcc15a7: Status 404 returned error can't find the container with id 540fc9c2596467f0cfb513f1d02e3011b267a2f83ff63120ba5fc165afcc15a7
Apr 22 19:23:45.013777 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.013754 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5092e3d5_3682_4a0c_bf3d_5313cc838278.slice/crio-084668b659fbcc56d8ad1639756de702a2602ae4e7fbdea58de4378e6c9a66b4 WatchSource:0}: Error finding container 084668b659fbcc56d8ad1639756de702a2602ae4e7fbdea58de4378e6c9a66b4: Status 404 returned error can't find the container with id 084668b659fbcc56d8ad1639756de702a2602ae4e7fbdea58de4378e6c9a66b4
Apr 22 19:23:45.016484 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.016466 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bc5667_8d3d_482d_9edb_6340167eb814.slice/crio-07f3f20f44bab8b1ceb178488e9615eca14a8e7330e328008fe6bc246b3c6011 WatchSource:0}: Error finding container 07f3f20f44bab8b1ceb178488e9615eca14a8e7330e328008fe6bc246b3c6011: Status 404 returned error can't find the container with id 07f3f20f44bab8b1ceb178488e9615eca14a8e7330e328008fe6bc246b3c6011
Apr 22 19:23:45.037864 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.037842 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1987be77_a025_44d0_b506_5a4bb7b2c605.slice/crio-7075eb6da3292c96c780427dbbc29fd913c92aec4f8576484b9a6761b3e9411e WatchSource:0}: Error finding container 7075eb6da3292c96c780427dbbc29fd913c92aec4f8576484b9a6761b3e9411e: Status 404 returned error can't find the container with id 7075eb6da3292c96c780427dbbc29fd913c92aec4f8576484b9a6761b3e9411e
Apr 22 19:23:45.038610 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.038569 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60db5cd9_d42e_4ebb_b880_d777700e74ea.slice/crio-35580a1d7ade958e13a9c504c5c9aac198dac6dce59a9522657a131b2c23d252 WatchSource:0}: Error finding container 35580a1d7ade958e13a9c504c5c9aac198dac6dce59a9522657a131b2c23d252: Status 404 returned error can't find the container with id 35580a1d7ade958e13a9c504c5c9aac198dac6dce59a9522657a131b2c23d252
Apr 22 19:23:45.039370 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:23:45.039352 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90687231_1b2c_4845_9e57_cab76563d259.slice/crio-0ebd8ed3eccfc4844b7f6e6a96948e183595e1fd4f3523ad0cd5dd24eb731d0b WatchSource:0}: Error finding container 0ebd8ed3eccfc4844b7f6e6a96948e183595e1fd4f3523ad0cd5dd24eb731d0b: Status 404 returned error can't find the container with id 0ebd8ed3eccfc4844b7f6e6a96948e183595e1fd4f3523ad0cd5dd24eb731d0b
Apr 22 19:23:45.059681 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.059653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:23:45.059816 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.059797 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:45.059880 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.059819 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:45.059880 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.059828 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.059880 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.059869 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.059856712 +0000 UTC m=+4.254051277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.069201 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.069168 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:45.355861 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.355807 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:43 +0000 UTC" deadline="2028-01-09 04:54:18.28023932 +0000 UTC"
Apr 22 19:23:45.355861 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.355840 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15033h30m32.924403098s"
Apr 22 19:23:45.363780 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.363656 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:23:45.363894 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.363803 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:23:45.373414 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.373359 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-88jfr" event={"ID":"60db5cd9-d42e-4ebb-b880-d777700e74ea","Type":"ContainerStarted","Data":"35580a1d7ade958e13a9c504c5c9aac198dac6dce59a9522657a131b2c23d252"}
Apr 22 19:23:45.375966 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.375941 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" event={"ID":"1987be77-a025-44d0-b506-5a4bb7b2c605","Type":"ContainerStarted","Data":"7075eb6da3292c96c780427dbbc29fd913c92aec4f8576484b9a6761b3e9411e"}
Apr 22 19:23:45.377176 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.377151 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v22lp" event={"ID":"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3","Type":"ContainerStarted","Data":"bbca4e598eaed955433e985113faecedf81abe7d81a28fcab13aa7aa852b79fc"}
Apr 22 19:23:45.379913 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.379878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2222w" event={"ID":"5092e3d5-3682-4a0c-bf3d-5313cc838278","Type":"ContainerStarted","Data":"084668b659fbcc56d8ad1639756de702a2602ae4e7fbdea58de4378e6c9a66b4"}
Apr 22 19:23:45.383214 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.382963 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal" event={"ID":"42a9abee4bf02f33d29bcb7ba1a804d2","Type":"ContainerStarted","Data":"04b8780436600efc847c6a596505adb210321a0a30b2dfedbd3440b583eb76fa"}
Apr 22 19:23:45.386203 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.385346 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"0ebd8ed3eccfc4844b7f6e6a96948e183595e1fd4f3523ad0cd5dd24eb731d0b"}
Apr 22 19:23:45.387498 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.387450 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerStarted","Data":"07f3f20f44bab8b1ceb178488e9615eca14a8e7330e328008fe6bc246b3c6011"}
Apr 22 19:23:45.393653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.393624 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfw29" event={"ID":"d46c700e-0847-4daf-bd26-4b29c5bee728","Type":"ContainerStarted","Data":"540fc9c2596467f0cfb513f1d02e3011b267a2f83ff63120ba5fc165afcc15a7"}
Apr 22 19:23:45.398729 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.396870 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7dhdx" event={"ID":"38bec921-89a6-4a82-b51d-20431c5dedc1","Type":"ContainerStarted","Data":"725ce38d4e333df1f8d19e125bc15cb00796ee4752d4266e0ad1742cd778b2c6"}
Apr 22 19:23:45.399857 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.399827 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" event={"ID":"fb1d2f05-e6fc-4b8c-a646-fdebb0847854","Type":"ContainerStarted","Data":"e626a43a0c3dc4ce7163640a27d2183fa55affb0121f41fcdcc7b65f7c064645"}
Apr 22 19:23:45.418514 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.418464 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-198.ec2.internal" podStartSLOduration=2.418448727 podStartE2EDuration="2.418448727s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22
19:23:45.416711175 +0000 UTC m=+3.610905762" watchObservedRunningTime="2026-04-22 19:23:45.418448727 +0000 UTC m=+3.612643314" Apr 22 19:23:45.437612 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.437287 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:45.864983 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:45.864930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:45.865155 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.865083 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:45.865155 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:45.865142 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:47.865124586 +0000 UTC m=+6.059319154 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:46.066053 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:46.065975 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:46.066210 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:46.066157 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:46.066210 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:46.066177 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:46.066210 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:46.066190 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:46.066351 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:46.066243 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:48.066226678 +0000 UTC m=+6.260421247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:46.366877 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:46.366328 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:46.366877 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:46.366446 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:46.418940 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:46.418881 2564 generic.go:358] "Generic (PLEG): container finished" podID="9710bcde2b9428f4968b64f229b324a2" containerID="d183e5eac78f775c1353fb766c8e4f00086b72682f9a0c94db09c9b7ce457d31" exitCode=0 Apr 22 19:23:46.420013 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:46.419785 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal" event={"ID":"9710bcde2b9428f4968b64f229b324a2","Type":"ContainerDied","Data":"d183e5eac78f775c1353fb766c8e4f00086b72682f9a0c94db09c9b7ce457d31"} Apr 22 19:23:47.363788 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:47.363756 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:47.363965 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:47.363896 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:47.425606 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:47.424954 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal" event={"ID":"9710bcde2b9428f4968b64f229b324a2","Type":"ContainerStarted","Data":"7c8455b132eb56a03f64986d09adfbc7c3b3fb87f7a9e1fadd50a37a73a75ceb"} Apr 22 19:23:47.882425 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:47.882390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:47.882680 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:47.882653 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:47.882785 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:47.882738 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:51.882719375 +0000 UTC m=+10.076913942 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.084789 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:48.084166 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:48.084789 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:48.084319 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:48.084789 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:48.084337 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:48.084789 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:48.084350 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.084789 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:48.084416 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:52.084390078 +0000 UTC m=+10.278584646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.364516 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:48.364295 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:48.364516 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:48.364404 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:49.363588 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:49.363108 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:49.363588 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:49.363251 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:50.364278 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:50.363922 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:50.364278 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:50.364057 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:51.363496 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:51.363355 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:51.363496 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:51.363485 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:51.916028 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:51.915989 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:51.916489 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:51.916151 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:51.916489 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:51.916219 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.916198105 +0000 UTC m=+18.110392673 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:52.118368 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:52.118334 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:52.118564 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:52.118515 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:52.118564 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:52.118532 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:52.118564 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:52.118545 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:52.118767 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:52.118601 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:00.118582541 +0000 UTC m=+18.312777110 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:52.364859 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:52.364781 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:52.365008 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:52.364899 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:53.363292 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:53.363256 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:53.363836 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:53.363373 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:54.363466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:54.363430 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:54.363912 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:54.363532 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:55.363827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:55.363795 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:55.364220 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:55.363924 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:56.363802 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:56.363767 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:56.364026 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:56.363883 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:57.363267 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:57.363189 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:57.363425 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:57.363311 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:58.363240 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:58.363205 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:23:58.363726 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:58.363307 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:23:59.363450 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:59.363411 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:59.364030 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:59.363550 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:23:59.974063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:23:59.974024 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:23:59.974209 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:59.974135 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:59.974209 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:23:59.974189 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:15.974175025 +0000 UTC m=+34.168369590 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:00.175612 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:00.175570 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:00.175851 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:00.175742 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:00.175851 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:00.175759 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:00.175851 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:00.175770 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:00.175851 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:00.175825 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:16.175808287 +0000 UTC m=+34.370002871 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:00.364176 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:00.364077 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:00.364621 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:00.364260 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:01.363525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:01.363488 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:01.363671 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:01.363597 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:02.364495 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.364184 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:02.365179 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:02.364579 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:02.453074 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453001 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:24:02.453415 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453387 2564 generic.go:358] "Generic (PLEG): container finished" podID="90687231-1b2c-4845-9e57-cab76563d259" containerID="64e21589a4050738f015a6f85340cacd7153de3e82e244b8d9028dfdaf57b9fe" exitCode=1
Apr 22 19:24:02.453496 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453471 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"fd3f509e3850d03e67a0b17cb5edc66b582a16562672b9f8bb05af52fc6849c2"}
Apr 22 19:24:02.453545 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453508 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"7b077d17e458ee2b6e99482069fe8757e7b67cc8ec46d099548799c45332809b"}
Apr 22 19:24:02.453545 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453522 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"8c4c0517c59788f542287416549a763ba483589d1f257ee8e08461ff8ba3493e"}
Apr 22 19:24:02.453545 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"1bec9f14788f12d67840c6f269200d3fbed0c90f747eb03cd53858d7f2873236"}
Apr 22 19:24:02.453653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453553 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerDied","Data":"64e21589a4050738f015a6f85340cacd7153de3e82e244b8d9028dfdaf57b9fe"}
Apr 22 19:24:02.453653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.453567 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"baf5b9be4ead9c86543ad95593c88fdf5fa7397d4fcc11507793bc4da5e27c2f"}
Apr 22 19:24:02.454829 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.454795 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="c4699290cceed02954869eab21c36e5033e2685eba020bf7d27f7573472485dc" exitCode=0
Apr 22 19:24:02.454933 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.454863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"c4699290cceed02954869eab21c36e5033e2685eba020bf7d27f7573472485dc"}
Apr 22 19:24:02.456771 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.456733 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7dhdx" event={"ID":"38bec921-89a6-4a82-b51d-20431c5dedc1","Type":"ContainerStarted","Data":"8e131ef303b8f79336c5765d4411775c3676abb9a54c75837f73c5d669570849"}
Apr 22 19:24:02.459494 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.459460 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" event={"ID":"fb1d2f05-e6fc-4b8c-a646-fdebb0847854","Type":"ContainerStarted","Data":"deb0854fe26ae6735e11a2c8307c708461df902e89aff908bda913defe898585"}
Apr 22 19:24:02.463904 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.463880 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-88jfr" event={"ID":"60db5cd9-d42e-4ebb-b880-d777700e74ea","Type":"ContainerStarted","Data":"f5fabbb3cabd90a22e384bf75aea77f2fbae1690327f2a07ecc05f898e3edfda"}
Apr 22 19:24:02.465357 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.465333 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" event={"ID":"1987be77-a025-44d0-b506-5a4bb7b2c605","Type":"ContainerStarted","Data":"eb9b5c881263a10e0e63af7b7bd6a3ff99ca74e2a58d560348b45700c1ae2418"}
Apr 22 19:24:02.466688 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.466665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v22lp" event={"ID":"92374958-e1ad-48ca-bdd6-3c9a98c2e9e3","Type":"ContainerStarted","Data":"6c49000aaba3efec1029c0afe26a165b92a8c70950ec943546d09033c4a28243"}
Apr 22 19:24:02.468439 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.468407 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2222w" event={"ID":"5092e3d5-3682-4a0c-bf3d-5313cc838278","Type":"ContainerStarted","Data":"3f6cbe60618591e300b342e61a4ab14da5dcd57acc0003ea34adab5c5585c532"}
Apr 22 19:24:02.479337 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.479298 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-198.ec2.internal" podStartSLOduration=19.479288312 podStartE2EDuration="19.479288312s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:47.449729489 +0000 UTC m=+5.643924075" watchObservedRunningTime="2026-04-22 19:24:02.479288312 +0000 UTC m=+20.673482900"
Apr 22 19:24:02.495478 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.495438 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hvwp6" podStartSLOduration=3.762184385 podStartE2EDuration="20.495425501s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.011854258 +0000 UTC m=+3.206048825" lastFinishedPulling="2026-04-22 19:24:01.745095375 +0000 UTC m=+19.939289941" observedRunningTime="2026-04-22 19:24:02.495032413 +0000 UTC m=+20.689226997" watchObservedRunningTime="2026-04-22 19:24:02.495425501 +0000 UTC m=+20.689620087"
Apr 22 19:24:02.509679 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.509632 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7dhdx" podStartSLOduration=3.776836684 podStartE2EDuration="20.509617521s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.013789846 +0000 UTC m=+3.207984411" lastFinishedPulling="2026-04-22 19:24:01.746570683 +0000 UTC m=+19.940765248" observedRunningTime="2026-04-22 19:24:02.509479057 +0000 UTC m=+20.703673644" watchObservedRunningTime="2026-04-22 19:24:02.509617521 +0000 UTC m=+20.703812109"
Apr 22 19:24:02.540393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.540335 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v22lp" podStartSLOduration=8.391473847 podStartE2EDuration="20.540317583s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.014778053 +0000 UTC m=+3.208972626" lastFinishedPulling="2026-04-22 19:23:57.163621784 +0000 UTC m=+15.357816362" observedRunningTime="2026-04-22 19:24:02.523980504 +0000 UTC m=+20.718175091" watchObservedRunningTime="2026-04-22 19:24:02.540317583 +0000 UTC m=+20.734512172"
Apr 22 19:24:02.540557 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.540461 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2222w" podStartSLOduration=3.653421157 podStartE2EDuration="20.540455549s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.015917778 +0000 UTC m=+3.210112350" lastFinishedPulling="2026-04-22 19:24:01.902952173 +0000 UTC m=+20.097146742" observedRunningTime="2026-04-22 19:24:02.540004022 +0000 UTC m=+20.734198609" watchObservedRunningTime="2026-04-22 19:24:02.540455549 +0000 UTC m=+20.734650138"
Apr 22 19:24:02.555362 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.555316 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-88jfr" podStartSLOduration=3.897869895 podStartE2EDuration="20.555299504s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.041741538 +0000 UTC m=+3.235936106" lastFinishedPulling="2026-04-22 19:24:01.69917114 +0000 UTC m=+19.893365715" observedRunningTime="2026-04-22 19:24:02.554951173 +0000 UTC m=+20.749145762" watchObservedRunningTime="2026-04-22 19:24:02.555299504 +0000 UTC m=+20.749494091"
Apr 22 19:24:02.879609 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:02.879588 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:24:03.197483 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.197455 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v22lp_92374958-e1ad-48ca-bdd6-3c9a98c2e9e3/node-ca/0.log"
Apr 22 19:24:03.303482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.303363 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:24:02.879604863Z","UUID":"601b746a-d2cb-4ef8-976c-09962a30d076","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:24:03.305177 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.305146 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:24:03.305177 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.305178 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:24:03.363099 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.363069 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:03.363252 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:03.363206 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:03.472275 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.472193 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rfw29" event={"ID":"d46c700e-0847-4daf-bd26-4b29c5bee728","Type":"ContainerStarted","Data":"1ee8e420437ce917ac44eef03bd0ef839001153a06385a6f6ed3b652105c99ae"}
Apr 22 19:24:03.474920 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.474877 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" event={"ID":"1987be77-a025-44d0-b506-5a4bb7b2c605","Type":"ContainerStarted","Data":"3b2213ced707fbc805e156d90fe22a9a2a03bd65bf3a8a928ff33a35d8d65759"}
Apr 22 19:24:03.488665 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:03.488614 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rfw29" podStartSLOduration=4.758210581 podStartE2EDuration="21.488599908s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.014651759 +0000 UTC m=+3.208846335" lastFinishedPulling="2026-04-22 19:24:01.745041081 +0000 UTC m=+19.939235662" observedRunningTime="2026-04-22 19:24:03.488390694 +0000 UTC m=+21.682585270" watchObservedRunningTime="2026-04-22 19:24:03.488599908 +0000 UTC m=+21.682794534"
Apr 22 19:24:04.363988 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:04.363946 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:04.364262 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:04.364075 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:04.478982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:04.478724 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" event={"ID":"1987be77-a025-44d0-b506-5a4bb7b2c605","Type":"ContainerStarted","Data":"1f09f5c471dc279c7b6e6f3005d6335152e2e3a7241634ddf486a8d902240d32"}
Apr 22 19:24:04.481770 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:04.481746 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:24:04.482125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:04.482097 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"f3063cbb1c8e27b1259167cce5cbb83a9869e7d4fde2201259af47de307773cb"}
Apr 22 19:24:04.512290 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:04.512242 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k8tpt" podStartSLOduration=3.92819314 podStartE2EDuration="22.512227904s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.041885794 +0000 UTC m=+3.236080364" lastFinishedPulling="2026-04-22 19:24:03.625920549 +0000 UTC m=+21.820115128" observedRunningTime="2026-04-22 19:24:04.511707191 +0000 UTC m=+22.705901769" watchObservedRunningTime="2026-04-22 19:24:04.512227904 +0000 UTC m=+22.706422487"
Apr 22 19:24:05.363493 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:05.363455 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:05.363742 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:05.363577 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:06.363331 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:06.363296 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:06.363907 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:06.363424 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:06.652064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:06.652010 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:24:06.652985 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:06.652950 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:24:07.363799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.363574 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:07.364506 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:07.363894 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:07.489791 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.489766 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:24:07.490056 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.490037 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"a89f0745119b8ab9d3029c1b975cd392a73736eb54a8b9432627bfd47609e84a"}
Apr 22 19:24:07.490372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.490342 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:24:07.490372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.490369 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:24:07.490623 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.490380 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:24:07.490623 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.490549 2564 scope.go:117] "RemoveContainer" containerID="64e21589a4050738f015a6f85340cacd7153de3e82e244b8d9028dfdaf57b9fe"
Apr 22 19:24:07.491872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.491853 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="8fb581fa6106682e259cc9e2db12bff9a4ae2e6fcb3ac14e22b137cc35779026" exitCode=0
Apr 22 19:24:07.491973 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.491927 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"8fb581fa6106682e259cc9e2db12bff9a4ae2e6fcb3ac14e22b137cc35779026"}
Apr 22 19:24:07.492232 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.492205 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:24:07.492841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.492733 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-88jfr"
Apr 22 19:24:07.506392 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.506373 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:24:07.506494 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:07.506484 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm"
Apr 22 19:24:08.363475 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.363443 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:08.363625 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:08.363565 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:08.498486 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.496347 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:24:08.498966 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.498927 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" event={"ID":"90687231-1b2c-4845-9e57-cab76563d259","Type":"ContainerStarted","Data":"fe239c07c34161a46cbc30a09da23b93f64ed129248d9323e8e84d761535c765"}
Apr 22 19:24:08.500628 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.500608 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="58dd17fdde9cd7ec50dde0fbcdc7d0c014cb8ca64391855a35a47cff6a6aa1e5" exitCode=0
Apr 22 19:24:08.500739 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.500704 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"58dd17fdde9cd7ec50dde0fbcdc7d0c014cb8ca64391855a35a47cff6a6aa1e5"}
Apr 22 19:24:08.531371 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:08.531324 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" podStartSLOduration=9.788776023 podStartE2EDuration="26.531312641s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.041886774 +0000 UTC m=+3.236081345" lastFinishedPulling="2026-04-22 19:24:01.784423398 +0000 UTC m=+19.978617963" observedRunningTime="2026-04-22 19:24:08.529893325 +0000 UTC m=+26.724087912" watchObservedRunningTime="2026-04-22 19:24:08.531312641 +0000 UTC m=+26.725507228"
Apr 22 19:24:09.363551 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:09.363479 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:09.363685 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:09.363577 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:09.503755 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:09.503724 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="2964525df578d96091e5cc8e3cf4f4eeff630c1f2c7eb442c6ca1d3137f00376" exitCode=0
Apr 22 19:24:09.504155 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:09.503800 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"2964525df578d96091e5cc8e3cf4f4eeff630c1f2c7eb442c6ca1d3137f00376"}
Apr 22 19:24:10.363839 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:10.363798 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:10.364000 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:10.363900 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:11.364085 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:11.364049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:11.364519 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:11.364166 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:12.364448 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:12.364410 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:12.364904 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:12.364529 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:13.363230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:13.363194 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:13.363398 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:13.363331 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:14.363817 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:14.363783 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:14.364228 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:14.363895 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:15.363412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:15.363383 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:15.363539 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:15.363501 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:15.517090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:15.517058 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="4950a3c1201415d60846a37bff6da2eb2008fbc5509a841afb801be405e41e1b" exitCode=0
Apr 22 19:24:15.517454 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:15.517101 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"4950a3c1201415d60846a37bff6da2eb2008fbc5509a841afb801be405e41e1b"}
Apr 22 19:24:15.990321 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:15.990238 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:15.990476 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:15.990341 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:15.990476 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:15.990394 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs podName:6f6d5518-179f-4f70-8c2c-5b1b2a244e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:47.990380068 +0000 UTC m=+66.184574633 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs") pod "network-metrics-daemon-jblt6" (UID: "6f6d5518-179f-4f70-8c2c-5b1b2a244e38") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:16.191269 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:16.191089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:16.191451 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:16.191238 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:24:16.191451 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:16.191304 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:24:16.191451 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:16.191316 2564 projected.go:194] Error preparing data for projected volume kube-api-access-kc67s for pod openshift-network-diagnostics/network-check-target-m4bzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:16.191451 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:16.191364 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s podName:95433104-c840-4e8f-a3ff-c645c636f399 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:48.191350782 +0000 UTC m=+66.385545347 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kc67s" (UniqueName: "kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s") pod "network-check-target-m4bzg" (UID: "95433104-c840-4e8f-a3ff-c645c636f399") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:16.363647 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:16.363571 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:16.363813 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:16.363663 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:16.521140 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:16.521106 2564 generic.go:358] "Generic (PLEG): container finished" podID="41bc5667-8d3d-482d-9edb-6340167eb814" containerID="6d1b42404e81ec378bc4317c0ccd92826c2e7e325262cda2610ef47ad20d2a44" exitCode=0
Apr 22 19:24:16.521505 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:16.521164 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerDied","Data":"6d1b42404e81ec378bc4317c0ccd92826c2e7e325262cda2610ef47ad20d2a44"}
Apr 22 19:24:17.363163 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:17.363137 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:17.363323 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:17.363239 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:17.524948 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:17.524920 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" event={"ID":"41bc5667-8d3d-482d-9edb-6340167eb814","Type":"ContainerStarted","Data":"baf6b7f021fce8e83cff4148a6f7e68ea557a3078b94b07916727f72e853c52a"}
Apr 22 19:24:17.550065 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:17.550022 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8vbtk" podStartSLOduration=5.531862049 podStartE2EDuration="35.550008595s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.036928708 +0000 UTC m=+3.231123273" lastFinishedPulling="2026-04-22 19:24:15.055075236 +0000 UTC m=+33.249269819" observedRunningTime="2026-04-22 19:24:17.548230078 +0000 UTC m=+35.742424666" watchObservedRunningTime="2026-04-22 19:24:17.550008595 +0000 UTC m=+35.744203181"
Apr 22 19:24:18.363980 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:18.363940 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:18.364143 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:18.364070 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:19.363932 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:19.363897 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:19.364299 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:19.364009 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:20.363299 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:20.363254 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:20.363496 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:20.363357 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:21.363866 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:21.363833 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6"
Apr 22 19:24:21.364229 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:21.363940 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38"
Apr 22 19:24:22.363782 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:22.363750 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:24:22.363936 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:22.363825 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399"
Apr 22 19:24:23.363809 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:23.363778 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:23.363982 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:23.363876 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:24:24.363827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:24.363797 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:24.363992 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:24.363912 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:25.363220 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:25.363181 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:25.363417 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:25.363288 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:24:26.363178 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:26.363141 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:26.363801 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:26.363278 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:26.525541 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:26.525510 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m4bzg"] Apr 22 19:24:26.527911 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:26.527885 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jblt6"] Apr 22 19:24:26.528010 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:26.527998 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:26.528146 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:26.528121 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:24:26.540948 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:26.540918 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:26.541065 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:26.541020 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:28.363077 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:28.363049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:28.363077 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:28.363073 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:28.363560 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:28.363145 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:28.363560 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:28.363286 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:24:30.363790 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.363760 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:30.363790 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.363779 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:30.364177 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:30.363864 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jblt6" podUID="6f6d5518-179f-4f70-8c2c-5b1b2a244e38" Apr 22 19:24:30.364177 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:30.363968 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m4bzg" podUID="95433104-c840-4e8f-a3ff-c645c636f399" Apr 22 19:24:30.646852 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.646612 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-198.ec2.internal" event="NodeReady" Apr 22 19:24:30.646979 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.646886 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:24:30.713160 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.713128 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k5rbc"] Apr 22 19:24:30.725668 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.725646 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dq5b5"] Apr 22 19:24:30.725789 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.725779 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:30.729268 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.729249 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wknft\"" Apr 22 19:24:30.729268 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.729261 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:24:30.729462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.729445 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:24:30.729502 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.729451 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:24:30.739610 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.739591 
2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:30.739857 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.739822 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5rbc"] Apr 22 19:24:30.744092 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.744073 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dq5b5"] Apr 22 19:24:30.744660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.744639 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bspvd\"" Apr 22 19:24:30.745009 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.744996 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:24:30.745072 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.745007 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:24:30.751171 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.751153 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nm6zf"] Apr 22 19:24:30.778962 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.778944 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nm6zf"] Apr 22 19:24:30.779062 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.779051 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:30.783491 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.783460 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:24:30.783738 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.783713 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pwghq\"" Apr 22 19:24:30.783738 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.783731 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:24:30.783872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.783743 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:24:30.783872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.783778 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:24:30.900114 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900005 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:30.900114 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900039 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3d4cf343-beba-4351-919c-473d70ddfbc4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nm6zf\" (UID: 
\"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:30.900114 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900065 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/341073cd-9280-4d10-acb9-b1c0b32e7850-config-volume\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:30.900114 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900099 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvc7\" (UniqueName: \"kubernetes.io/projected/341073cd-9280-4d10-acb9-b1c0b32e7850-kube-api-access-sjvc7\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900165 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64bg6\" (UniqueName: \"kubernetes.io/projected/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-api-access-64bg6\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900186 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/341073cd-9280-4d10-acb9-b1c0b32e7850-tmp-dir\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900207 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkv7\" 
(UniqueName: \"kubernetes.io/projected/5bcafd60-0344-423e-88a2-7e2ffae0f188-kube-api-access-2qkv7\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5bcafd60-0344-423e-88a2-7e2ffae0f188-cert\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900241 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d4cf343-beba-4351-919c-473d70ddfbc4-data-volume\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900257 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/341073cd-9280-4d10-acb9-b1c0b32e7850-metrics-tls\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:30.900366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:30.900274 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3d4cf343-beba-4351-919c-473d70ddfbc4-crio-socket\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001321 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001287 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64bg6\" (UniqueName: \"kubernetes.io/projected/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-api-access-64bg6\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001321 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001325 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/341073cd-9280-4d10-acb9-b1c0b32e7850-tmp-dir\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001344 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkv7\" (UniqueName: \"kubernetes.io/projected/5bcafd60-0344-423e-88a2-7e2ffae0f188-kube-api-access-2qkv7\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001374 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5bcafd60-0344-423e-88a2-7e2ffae0f188-cert\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001399 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d4cf343-beba-4351-919c-473d70ddfbc4-data-volume\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001525 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:24:31.001423 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/341073cd-9280-4d10-acb9-b1c0b32e7850-metrics-tls\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3d4cf343-beba-4351-919c-473d70ddfbc4-crio-socket\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3d4cf343-beba-4351-919c-473d70ddfbc4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001849 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/341073cd-9280-4d10-acb9-b1c0b32e7850-config-volume\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " 
pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.001849 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001567 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvc7\" (UniqueName: \"kubernetes.io/projected/341073cd-9280-4d10-acb9-b1c0b32e7850-kube-api-access-sjvc7\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.001849 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001710 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3d4cf343-beba-4351-919c-473d70ddfbc4-crio-socket\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.001849 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001733 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/341073cd-9280-4d10-acb9-b1c0b32e7850-tmp-dir\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.001849 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.001796 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d4cf343-beba-4351-919c-473d70ddfbc4-data-volume\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.002235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.002207 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/341073cd-9280-4d10-acb9-b1c0b32e7850-config-volume\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " 
pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.005243 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.005224 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3d4cf343-beba-4351-919c-473d70ddfbc4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.005338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.005242 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/341073cd-9280-4d10-acb9-b1c0b32e7850-metrics-tls\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.005338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.005308 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5bcafd60-0344-423e-88a2-7e2ffae0f188-cert\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:31.011918 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.011898 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nm6zf\" (UID: \"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.015223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.015201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64bg6\" (UniqueName: \"kubernetes.io/projected/3d4cf343-beba-4351-919c-473d70ddfbc4-kube-api-access-64bg6\") pod \"insights-runtime-extractor-nm6zf\" (UID: 
\"3d4cf343-beba-4351-919c-473d70ddfbc4\") " pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.015318 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.015301 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvc7\" (UniqueName: \"kubernetes.io/projected/341073cd-9280-4d10-acb9-b1c0b32e7850-kube-api-access-sjvc7\") pod \"dns-default-dq5b5\" (UID: \"341073cd-9280-4d10-acb9-b1c0b32e7850\") " pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.016324 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.016299 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkv7\" (UniqueName: \"kubernetes.io/projected/5bcafd60-0344-423e-88a2-7e2ffae0f188-kube-api-access-2qkv7\") pod \"ingress-canary-k5rbc\" (UID: \"5bcafd60-0344-423e-88a2-7e2ffae0f188\") " pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:31.035165 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.035145 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5rbc" Apr 22 19:24:31.048874 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.048854 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:31.086975 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.086937 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nm6zf" Apr 22 19:24:31.215345 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.215314 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5rbc"] Apr 22 19:24:31.218226 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.218198 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dq5b5"] Apr 22 19:24:31.219425 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:31.219402 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bcafd60_0344_423e_88a2_7e2ffae0f188.slice/crio-cdb8f4ecd3342daa1860ac4541419eb6d7e1592349a204da412ec6242340cfc9 WatchSource:0}: Error finding container cdb8f4ecd3342daa1860ac4541419eb6d7e1592349a204da412ec6242340cfc9: Status 404 returned error can't find the container with id cdb8f4ecd3342daa1860ac4541419eb6d7e1592349a204da412ec6242340cfc9 Apr 22 19:24:31.221532 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:31.221506 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341073cd_9280_4d10_acb9_b1c0b32e7850.slice/crio-f3041aecaa04edc9afe87336c29146bb98b1c117caf941423a0cfc2cee5fc559 WatchSource:0}: Error finding container f3041aecaa04edc9afe87336c29146bb98b1c117caf941423a0cfc2cee5fc559: Status 404 returned error can't find the container with id f3041aecaa04edc9afe87336c29146bb98b1c117caf941423a0cfc2cee5fc559 Apr 22 19:24:31.239346 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.239321 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nm6zf"] Apr 22 19:24:31.242311 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:31.242289 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d4cf343_beba_4351_919c_473d70ddfbc4.slice/crio-a219c1515c41074657b8c337a7759876e37f706a2ae547396ab3eadbf7681121 WatchSource:0}: Error finding container a219c1515c41074657b8c337a7759876e37f706a2ae547396ab3eadbf7681121: Status 404 returned error can't find the container with id a219c1515c41074657b8c337a7759876e37f706a2ae547396ab3eadbf7681121 Apr 22 19:24:31.550918 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.550882 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nm6zf" event={"ID":"3d4cf343-beba-4351-919c-473d70ddfbc4","Type":"ContainerStarted","Data":"b4691b4e445a06ec7ca536c5d316f0ca7f53d65e23474ad6c4bfeaf2f1747baf"} Apr 22 19:24:31.550918 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.550921 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nm6zf" event={"ID":"3d4cf343-beba-4351-919c-473d70ddfbc4","Type":"ContainerStarted","Data":"a219c1515c41074657b8c337a7759876e37f706a2ae547396ab3eadbf7681121"} Apr 22 19:24:31.551753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.551732 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dq5b5" event={"ID":"341073cd-9280-4d10-acb9-b1c0b32e7850","Type":"ContainerStarted","Data":"f3041aecaa04edc9afe87336c29146bb98b1c117caf941423a0cfc2cee5fc559"} Apr 22 19:24:31.552469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:31.552450 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5rbc" event={"ID":"5bcafd60-0344-423e-88a2-7e2ffae0f188","Type":"ContainerStarted","Data":"cdb8f4ecd3342daa1860ac4541419eb6d7e1592349a204da412ec6242340cfc9"} Apr 22 19:24:32.371216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.371049 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:32.371533 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.371129 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:32.375081 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.375059 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:32.375230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.375072 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7c2z6\"" Apr 22 19:24:32.375299 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.375071 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\"" Apr 22 19:24:32.375299 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.375075 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:32.375397 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:32.375128 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:33.558621 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.558581 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dq5b5" event={"ID":"341073cd-9280-4d10-acb9-b1c0b32e7850","Type":"ContainerStarted","Data":"6f893254727f892db1b91b399ac65f36a2ddc049a1d901c9230e405c21c3fa60"} Apr 22 19:24:33.558621 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.558622 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dq5b5" 
event={"ID":"341073cd-9280-4d10-acb9-b1c0b32e7850","Type":"ContainerStarted","Data":"f9cbfbf7623037a1c727674d7c6063ec51653d00ac6331aa7959d7a886598e96"} Apr 22 19:24:33.559436 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.558714 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:33.559915 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.559893 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5rbc" event={"ID":"5bcafd60-0344-423e-88a2-7e2ffae0f188","Type":"ContainerStarted","Data":"7761a69603bc71490d811fbdd967620a4f47d2e255b3c377ab790d346a543e0b"} Apr 22 19:24:33.561257 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.561237 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nm6zf" event={"ID":"3d4cf343-beba-4351-919c-473d70ddfbc4","Type":"ContainerStarted","Data":"337d40e2b3b61a151099f24a879e36d09919bd6a422fe1e94987636a180a0c93"} Apr 22 19:24:33.576248 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.576210 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dq5b5" podStartSLOduration=1.568744521 podStartE2EDuration="3.576199274s" podCreationTimestamp="2026-04-22 19:24:30 +0000 UTC" firstStartedPulling="2026-04-22 19:24:31.223254573 +0000 UTC m=+49.417449138" lastFinishedPulling="2026-04-22 19:24:33.23070931 +0000 UTC m=+51.424903891" observedRunningTime="2026-04-22 19:24:33.5749021 +0000 UTC m=+51.769096698" watchObservedRunningTime="2026-04-22 19:24:33.576199274 +0000 UTC m=+51.770393860" Apr 22 19:24:33.589606 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:33.589567 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k5rbc" podStartSLOduration=1.5754345669999998 podStartE2EDuration="3.589554598s" podCreationTimestamp="2026-04-22 19:24:30 +0000 UTC" 
firstStartedPulling="2026-04-22 19:24:31.221537964 +0000 UTC m=+49.415732529" lastFinishedPulling="2026-04-22 19:24:33.235657982 +0000 UTC m=+51.429852560" observedRunningTime="2026-04-22 19:24:33.58918978 +0000 UTC m=+51.783384367" watchObservedRunningTime="2026-04-22 19:24:33.589554598 +0000 UTC m=+51.783749163" Apr 22 19:24:34.506614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.506391 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4tjh8"] Apr 22 19:24:34.511173 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.511151 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.515170 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515127 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:24:34.515170 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515170 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:24:34.515351 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515127 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:24:34.515411 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515362 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:24:34.515841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515535 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b8vtg\"" Apr 22 19:24:34.515841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.515628 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:24:34.518743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.518666 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4tjh8"] Apr 22 19:24:34.628412 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.628371 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb4fa87-e378-4de8-8930-a788bc72560c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.628777 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.628415 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.628777 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.628448 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.628777 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.628495 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzr5k\" (UniqueName: 
\"kubernetes.io/projected/7bb4fa87-e378-4de8-8930-a788bc72560c-kube-api-access-zzr5k\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.729717 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.729663 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb4fa87-e378-4de8-8930-a788bc72560c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.729890 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.729734 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.729951 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.729887 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.729951 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.729931 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzr5k\" (UniqueName: \"kubernetes.io/projected/7bb4fa87-e378-4de8-8930-a788bc72560c-kube-api-access-zzr5k\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: 
\"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.730540 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.730516 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb4fa87-e378-4de8-8930-a788bc72560c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.733521 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.733499 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.733618 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.733571 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bb4fa87-e378-4de8-8930-a788bc72560c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.738832 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.738808 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzr5k\" (UniqueName: \"kubernetes.io/projected/7bb4fa87-e378-4de8-8930-a788bc72560c-kube-api-access-zzr5k\") pod \"prometheus-operator-5676c8c784-4tjh8\" (UID: \"7bb4fa87-e378-4de8-8930-a788bc72560c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.822803 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.822770 2564 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" Apr 22 19:24:34.932853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:34.932823 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4tjh8"] Apr 22 19:24:34.935559 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:34.935540 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb4fa87_e378_4de8_8930_a788bc72560c.slice/crio-af5eda827388d880cd1ed003d0b60d0d2f1ca40289e4033e549fd9e08a928be6 WatchSource:0}: Error finding container af5eda827388d880cd1ed003d0b60d0d2f1ca40289e4033e549fd9e08a928be6: Status 404 returned error can't find the container with id af5eda827388d880cd1ed003d0b60d0d2f1ca40289e4033e549fd9e08a928be6 Apr 22 19:24:35.573264 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:35.573223 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nm6zf" event={"ID":"3d4cf343-beba-4351-919c-473d70ddfbc4","Type":"ContainerStarted","Data":"6d13238799e0901250eb928aece923359b0ce67e4235c67024d78904baa9d810"} Apr 22 19:24:35.574372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:35.574341 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" event={"ID":"7bb4fa87-e378-4de8-8930-a788bc72560c","Type":"ContainerStarted","Data":"af5eda827388d880cd1ed003d0b60d0d2f1ca40289e4033e549fd9e08a928be6"} Apr 22 19:24:35.592232 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:35.592186 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nm6zf" podStartSLOduration=2.335856303 podStartE2EDuration="5.592171308s" podCreationTimestamp="2026-04-22 19:24:30 +0000 UTC" firstStartedPulling="2026-04-22 19:24:31.34371511 +0000 UTC 
m=+49.537909675" lastFinishedPulling="2026-04-22 19:24:34.600030115 +0000 UTC m=+52.794224680" observedRunningTime="2026-04-22 19:24:35.590546475 +0000 UTC m=+53.784741085" watchObservedRunningTime="2026-04-22 19:24:35.592171308 +0000 UTC m=+53.786365886" Apr 22 19:24:36.578768 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:36.578731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" event={"ID":"7bb4fa87-e378-4de8-8930-a788bc72560c","Type":"ContainerStarted","Data":"3e355c64afcf2a14fdfc6d389c6f3baa5a0b79855ab4dc2ee4614e93367682b2"} Apr 22 19:24:36.578768 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:36.578773 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" event={"ID":"7bb4fa87-e378-4de8-8930-a788bc72560c","Type":"ContainerStarted","Data":"f6bb8472eddaff8ac7dc3194828157a4fddb1b3adf07fa435cb58ad130720246"} Apr 22 19:24:36.595311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:36.595258 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-4tjh8" podStartSLOduration=1.356566178 podStartE2EDuration="2.595243034s" podCreationTimestamp="2026-04-22 19:24:34 +0000 UTC" firstStartedPulling="2026-04-22 19:24:34.937377455 +0000 UTC m=+53.131572020" lastFinishedPulling="2026-04-22 19:24:36.17605431 +0000 UTC m=+54.370248876" observedRunningTime="2026-04-22 19:24:36.594657093 +0000 UTC m=+54.788851680" watchObservedRunningTime="2026-04-22 19:24:36.595243034 +0000 UTC m=+54.789437621" Apr 22 19:24:37.857469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.857437 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78744977db-wmsfk"] Apr 22 19:24:37.862088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.862072 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.866137 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866105 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:24:37.866137 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866126 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:24:37.866319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866142 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:24:37.866319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866152 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:24:37.866319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866163 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:24:37.866319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866106 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-skskp\"" Apr 22 19:24:37.866319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866294 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:24:37.866563 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.866436 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:24:37.870822 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.870805 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:24:37.873191 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:24:37.873034 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78744977db-wmsfk"] Apr 22 19:24:37.950337 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950337 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950340 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw95f\" (UniqueName: \"kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950517 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950445 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950517 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950471 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950517 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950492 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950517 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950510 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:37.950645 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:37.950525 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.052643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052615 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.052798 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052648 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" 
Apr 22 19:24:38.052798 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052670 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.052798 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052759 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.052957 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052797 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.052957 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.052838 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.053066 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.053039 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw95f\" (UniqueName: \"kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f\") pod \"console-78744977db-wmsfk\" (UID: 
\"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.053505 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.053411 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.053985 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.053437 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.054097 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.054025 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.054097 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.054077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.057504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.055790 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert\") pod 
\"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.057504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.055798 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.062188 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.062167 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw95f\" (UniqueName: \"kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f\") pod \"console-78744977db-wmsfk\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.170663 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.170590 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:38.278094 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.278064 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78744977db-wmsfk"] Apr 22 19:24:38.281066 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:38.281032 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf2c7d5_655c_413d_bdb2_cb1a7da45620.slice/crio-6c66752aa7257f19f10dedab1de0061c9cdefdc339ce27779f77fa533f931c0b WatchSource:0}: Error finding container 6c66752aa7257f19f10dedab1de0061c9cdefdc339ce27779f77fa533f931c0b: Status 404 returned error can't find the container with id 6c66752aa7257f19f10dedab1de0061c9cdefdc339ce27779f77fa533f931c0b Apr 22 19:24:38.587593 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.587563 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78744977db-wmsfk" event={"ID":"bcf2c7d5-655c-413d-bdb2-cb1a7da45620","Type":"ContainerStarted","Data":"6c66752aa7257f19f10dedab1de0061c9cdefdc339ce27779f77fa533f931c0b"} Apr 22 19:24:38.868001 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.867927 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5"] Apr 22 19:24:38.884760 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.884734 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5"] Apr 22 19:24:38.884953 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.884853 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:38.888569 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.888540 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:24:38.888688 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.888591 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-r6p6j\"" Apr 22 19:24:38.888688 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.888540 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 19:24:38.897085 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.897065 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w5v7d"] Apr 22 19:24:38.912753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.912731 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.916429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.916406 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:24:38.916552 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.916529 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-42bwr\"" Apr 22 19:24:38.916958 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.916804 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:24:38.917302 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.917217 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:24:38.920308 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.920270 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w5v7d"] Apr 22 19:24:38.930931 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.930909 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9lsf9"] Apr 22 19:24:38.951916 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.951893 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.954358 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.954332 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:24:38.954554 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.954531 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kkgwm\"" Apr 22 19:24:38.954637 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.954531 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:24:38.954754 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.954736 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:24:38.958593 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958566 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.958713 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958612 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xss85\" (UniqueName: \"kubernetes.io/projected/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-api-access-xss85\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.958713 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958644 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-metrics-client-ca\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.958713 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958674 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958723 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958758 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hfk\" (UniqueName: \"kubernetes.io/projected/5322e235-8e02-4de9-8bd7-c1732a34c595-kube-api-access-v6hfk\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958784 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-textfile\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958816 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-sys\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958844 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958872 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-root\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958917 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-wtmp\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958940 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.958976 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:38.959015 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.959005 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.959741 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.959052 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dc47d5-c6f5-467f-ae4f-27ae71a19818-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:38.959741 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.959078 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcr7\" (UniqueName: \"kubernetes.io/projected/72dc47d5-c6f5-467f-ae4f-27ae71a19818-kube-api-access-llcr7\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:38.959741 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.959106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5c25b062-ea8e-4f67-b431-931fbd0173f4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:38.959741 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:38.959142 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.060441 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xss85\" (UniqueName: 
\"kubernetes.io/projected/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-api-access-xss85\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.060441 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-metrics-client-ca\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060441 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060474 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060505 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hfk\" (UniqueName: \"kubernetes.io/projected/5322e235-8e02-4de9-8bd7-c1732a34c595-kube-api-access-v6hfk\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " 
pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060531 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-textfile\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060842 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-sys\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060877 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060930 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-textfile\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.060950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-root\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060959 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-wtmp\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.060983 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061024 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061056 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061109 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dc47d5-c6f5-467f-ae4f-27ae71a19818-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061109 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-sys\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061127 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-accelerators-collector-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061222 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-wtmp\") pod \"node-exporter-9lsf9\" (UID: 
\"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llcr7\" (UniqueName: \"kubernetes.io/projected/72dc47d5-c6f5-467f-ae4f-27ae71a19818-kube-api-access-llcr7\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061298 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5c25b062-ea8e-4f67-b431-931fbd0173f4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061336 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061380 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.061488 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:39.061492 2564 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:24:39.062149 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:39.061559 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls podName:5322e235-8e02-4de9-8bd7-c1732a34c595 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:39.561540079 +0000 UTC m=+57.755734649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls") pod "node-exporter-9lsf9" (UID: "5322e235-8e02-4de9-8bd7-c1732a34c595") : secret "node-exporter-tls" not found Apr 22 19:24:39.062149 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.061797 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5322e235-8e02-4de9-8bd7-c1732a34c595-root\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.062253 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.062170 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5322e235-8e02-4de9-8bd7-c1732a34c595-metrics-client-ca\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.062305 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:39.062268 2564 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 19:24:39.062362 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:39.062315 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls podName:5c25b062-ea8e-4f67-b431-931fbd0173f4 
nodeName:}" failed. No retries permitted until 2026-04-22 19:24:39.562300669 +0000 UTC m=+57.756495237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-w5v7d" (UID: "5c25b062-ea8e-4f67-b431-931fbd0173f4") : secret "kube-state-metrics-tls" not found Apr 22 19:24:39.062362 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.062339 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5c25b062-ea8e-4f67-b431-931fbd0173f4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.063028 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.063002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dc47d5-c6f5-467f-ae4f-27ae71a19818-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.063926 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.063904 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.064010 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.063959 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.064010 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.064002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/72dc47d5-c6f5-467f-ae4f-27ae71a19818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.069081 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.069061 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xss85\" (UniqueName: \"kubernetes.io/projected/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-api-access-xss85\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.069179 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.069105 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hfk\" (UniqueName: \"kubernetes.io/projected/5322e235-8e02-4de9-8bd7-c1732a34c595-kube-api-access-v6hfk\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.073298 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.073277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: 
\"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.073416 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.073397 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c25b062-ea8e-4f67-b431-931fbd0173f4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.073616 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.073595 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcr7\" (UniqueName: \"kubernetes.io/projected/72dc47d5-c6f5-467f-ae4f-27ae71a19818-kube-api-access-llcr7\") pod \"openshift-state-metrics-9d44df66c-b6rv5\" (UID: \"72dc47d5-c6f5-467f-ae4f-27ae71a19818\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.075763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.075743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.196429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.196341 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" Apr 22 19:24:39.339557 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.339505 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5"] Apr 22 19:24:39.343242 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:39.343208 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dc47d5_c6f5_467f_ae4f_27ae71a19818.slice/crio-baffa2de8fd099ca57c1a86f50d38a8f0672f4cf4c5c1ebfc9bf6d21e4c78d81 WatchSource:0}: Error finding container baffa2de8fd099ca57c1a86f50d38a8f0672f4cf4c5c1ebfc9bf6d21e4c78d81: Status 404 returned error can't find the container with id baffa2de8fd099ca57c1a86f50d38a8f0672f4cf4c5c1ebfc9bf6d21e4c78d81 Apr 22 19:24:39.519302 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.519199 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrxcm" Apr 22 19:24:39.567278 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.566795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.567278 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.566860 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.569678 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.569649 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5322e235-8e02-4de9-8bd7-c1732a34c595-node-exporter-tls\") pod \"node-exporter-9lsf9\" (UID: \"5322e235-8e02-4de9-8bd7-c1732a34c595\") " pod="openshift-monitoring/node-exporter-9lsf9" Apr 22 19:24:39.569803 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.569747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c25b062-ea8e-4f67-b431-931fbd0173f4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w5v7d\" (UID: \"5c25b062-ea8e-4f67-b431-931fbd0173f4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" Apr 22 19:24:39.593192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.593151 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" event={"ID":"72dc47d5-c6f5-467f-ae4f-27ae71a19818","Type":"ContainerStarted","Data":"bb15548ab7216d2a51439b93766dd4720be9fc9041a85ffcb8941f97405827ea"} Apr 22 19:24:39.593192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.593191 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" event={"ID":"72dc47d5-c6f5-467f-ae4f-27ae71a19818","Type":"ContainerStarted","Data":"f74d5797fcf2b62aff81bd3a818e4176caeb1c4b48c82839b4ff858a0ee255d8"} Apr 22 19:24:39.593385 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.593203 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" event={"ID":"72dc47d5-c6f5-467f-ae4f-27ae71a19818","Type":"ContainerStarted","Data":"baffa2de8fd099ca57c1a86f50d38a8f0672f4cf4c5c1ebfc9bf6d21e4c78d81"} Apr 22 19:24:39.823553 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.823521 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d"
Apr 22 19:24:39.862580 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.862542 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9lsf9"
Apr 22 19:24:39.872425 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:39.872373 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5322e235_8e02_4de9_8bd7_c1732a34c595.slice/crio-7b174156ff7289a9dce42bf06d1cb18f080443769df2169a0a743a9b7082c6af WatchSource:0}: Error finding container 7b174156ff7289a9dce42bf06d1cb18f080443769df2169a0a743a9b7082c6af: Status 404 returned error can't find the container with id 7b174156ff7289a9dce42bf06d1cb18f080443769df2169a0a743a9b7082c6af
Apr 22 19:24:39.977556 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.977527 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w5v7d"]
Apr 22 19:24:39.979811 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.979786 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:24:39.983300 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.983279 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987516 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987585 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987596 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987637 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987640 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 19:24:39.987725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987687 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 19:24:39.988019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.987781 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 19:24:39.988088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.988072 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 19:24:39.988144 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.988074 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-pzjbf\""
Apr 22 19:24:39.988144 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.988073 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 19:24:39.998138 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:39.998117 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:24:40.070350 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070316 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-config-out\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070534 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070355 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070534 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070382 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjgvw\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-kube-api-access-fjgvw\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070534 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070500 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070721 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070554 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070721 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070617 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070721 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070645 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-web-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070878 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070719 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070878 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070754 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070878 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.070878 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070851 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.071067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070896 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.071067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.070919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171658 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171684 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171728 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-web-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:40.171746 2564 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:24:40.171823 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls podName:5c424d6e-0497-4717-a020-80361697c6d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:40.671801885 +0000 UTC m=+58.865996465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "5c424d6e-0497-4717-a020-80361697c6d9") : secret "alertmanager-main-tls" not found
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171897 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171942 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.171982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.171978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.172302 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172013 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.172302 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172044 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172882 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-config-out\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172949 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjgvw\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-kube-api-access-fjgvw\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.172975 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.173758 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.173856 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.173815 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c424d6e-0497-4717-a020-80361697c6d9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.175062 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.174997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.176988 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.176968 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c424d6e-0497-4717-a020-80361697c6d9-config-out\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177086 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177221 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177543 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177520 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177543 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177537 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-web-config\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177662 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177557 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.177662 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.177624 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.183768 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.183749 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjgvw\" (UniqueName: \"kubernetes.io/projected/5c424d6e-0497-4717-a020-80361697c6d9-kube-api-access-fjgvw\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.597248 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.597211 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lsf9" event={"ID":"5322e235-8e02-4de9-8bd7-c1732a34c595","Type":"ContainerStarted","Data":"7b174156ff7289a9dce42bf06d1cb18f080443769df2169a0a743a9b7082c6af"}
Apr 22 19:24:40.677006 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.676971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.679664 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.679637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5c424d6e-0497-4717-a020-80361697c6d9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5c424d6e-0497-4717-a020-80361697c6d9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.896094 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.896015 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:24:40.956547 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.956518 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-644d9fc5c6-272bn"]
Apr 22 19:24:40.965649 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.965624 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.968549 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968524 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 19:24:40.968660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968524 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 19:24:40.968660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968579 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 19:24:40.968660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968524 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-b7k1dpf9e6782\""
Apr 22 19:24:40.968660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968525 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 19:24:40.968660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968525 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 19:24:40.968904 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.968533 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wfcrc\""
Apr 22 19:24:40.972770 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.972748 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-644d9fc5c6-272bn"]
Apr 22 19:24:40.979464 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979436 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979598 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979581 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979649 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979615 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979716 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979646 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-grpc-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979776 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979725 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b1e50bb-3dd3-46b1-a930-24324c91640e-metrics-client-ca\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979829 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979766 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979829 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:40.979910 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:40.979825 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vp2c\" (UniqueName: \"kubernetes.io/projected/9b1e50bb-3dd3-46b1-a930-24324c91640e-kube-api-access-8vp2c\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.002868 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:41.002839 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c25b062_ea8e_4f67_b431_931fbd0173f4.slice/crio-ad8c5116ab094db3f7e3ae2a1c50cf9f3bd1bb0584949fb621127f703cfb2807 WatchSource:0}: Error finding container ad8c5116ab094db3f7e3ae2a1c50cf9f3bd1bb0584949fb621127f703cfb2807: Status 404 returned error can't find the container with id ad8c5116ab094db3f7e3ae2a1c50cf9f3bd1bb0584949fb621127f703cfb2807
Apr 22 19:24:41.081213 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081376 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081223 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081376 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081275 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vp2c\" (UniqueName: \"kubernetes.io/projected/9b1e50bb-3dd3-46b1-a930-24324c91640e-kube-api-access-8vp2c\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081376 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081500 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081500 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081442 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081500 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081468 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-grpc-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.081636 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.081498 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b1e50bb-3dd3-46b1-a930-24324c91640e-metrics-client-ca\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.082254 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.082206 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b1e50bb-3dd3-46b1-a930-24324c91640e-metrics-client-ca\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.084294 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.084249 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.084650 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.084627 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-grpc-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.085373 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.085335 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.085688 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.085665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.085996 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.085974 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.087306 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.087286 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b1e50bb-3dd3-46b1-a930-24324c91640e-secret-thanos-querier-tls\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.089802 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.089760 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vp2c\" (UniqueName: \"kubernetes.io/projected/9b1e50bb-3dd3-46b1-a930-24324c91640e-kube-api-access-8vp2c\") pod \"thanos-querier-644d9fc5c6-272bn\" (UID: \"9b1e50bb-3dd3-46b1-a930-24324c91640e\") " pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.277480 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.277399 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn"
Apr 22 19:24:41.563181 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.563153 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:24:41.565148 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:41.565122 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c424d6e_0497_4717_a020_80361697c6d9.slice/crio-f9655964f4348a20bd92f7e45c95b95651873dd448a4c52caad5109d79e09e6f WatchSource:0}: Error finding container f9655964f4348a20bd92f7e45c95b95651873dd448a4c52caad5109d79e09e6f: Status 404 returned error can't find the container with id f9655964f4348a20bd92f7e45c95b95651873dd448a4c52caad5109d79e09e6f
Apr 22 19:24:41.572913 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.572888 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-644d9fc5c6-272bn"]
Apr 22 19:24:41.576211 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:41.576190 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1e50bb_3dd3_46b1_a930_24324c91640e.slice/crio-ff2779a3eba9d7b619cb039c946e7b0c409bf988c55f61dc9fa69bede5e18d8d WatchSource:0}: Error finding container ff2779a3eba9d7b619cb039c946e7b0c409bf988c55f61dc9fa69bede5e18d8d: Status 404 returned error can't find the container with id ff2779a3eba9d7b619cb039c946e7b0c409bf988c55f61dc9fa69bede5e18d8d
Apr 22 19:24:41.601898 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.601867 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"ff2779a3eba9d7b619cb039c946e7b0c409bf988c55f61dc9fa69bede5e18d8d"}
Apr 22 19:24:41.602979 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.602953 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" event={"ID":"5c25b062-ea8e-4f67-b431-931fbd0173f4","Type":"ContainerStarted","Data":"ad8c5116ab094db3f7e3ae2a1c50cf9f3bd1bb0584949fb621127f703cfb2807"}
Apr 22 19:24:41.604166 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.604137 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78744977db-wmsfk" event={"ID":"bcf2c7d5-655c-413d-bdb2-cb1a7da45620","Type":"ContainerStarted","Data":"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2"}
Apr 22 19:24:41.606137 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.606115 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" event={"ID":"72dc47d5-c6f5-467f-ae4f-27ae71a19818","Type":"ContainerStarted","Data":"e19e57d0e85c47d9e7e605694e61a0a659667029bd35850c9844eb0dc0023b64"}
Apr 22 19:24:41.607605 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.607586 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lsf9" event={"ID":"5322e235-8e02-4de9-8bd7-c1732a34c595","Type":"ContainerStarted","Data":"a5d061080f941268857492fe1c617e0feb6473904f4521e267f9afcd928e6256"}
Apr 22 19:24:41.608461 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.608443 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"f9655964f4348a20bd92f7e45c95b95651873dd448a4c52caad5109d79e09e6f"}
Apr 22 19:24:41.625134 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.625093 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78744977db-wmsfk" podStartSLOduration=1.489102259 podStartE2EDuration="4.625081481s" podCreationTimestamp="2026-04-22 19:24:37 +0000 UTC"
firstStartedPulling="2026-04-22 19:24:38.282828956 +0000 UTC m=+56.477023521" lastFinishedPulling="2026-04-22 19:24:41.418808171 +0000 UTC m=+59.613002743" observedRunningTime="2026-04-22 19:24:41.624531542 +0000 UTC m=+59.818726153" watchObservedRunningTime="2026-04-22 19:24:41.625081481 +0000 UTC m=+59.819276067" Apr 22 19:24:41.640820 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:41.640776 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b6rv5" podStartSLOduration=1.7197639740000001 podStartE2EDuration="3.640763467s" podCreationTimestamp="2026-04-22 19:24:38 +0000 UTC" firstStartedPulling="2026-04-22 19:24:39.497805763 +0000 UTC m=+57.692000344" lastFinishedPulling="2026-04-22 19:24:41.418805266 +0000 UTC m=+59.612999837" observedRunningTime="2026-04-22 19:24:41.640240833 +0000 UTC m=+59.834435421" watchObservedRunningTime="2026-04-22 19:24:41.640763467 +0000 UTC m=+59.834958073" Apr 22 19:24:42.614142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:42.614102 2564 generic.go:358] "Generic (PLEG): container finished" podID="5322e235-8e02-4de9-8bd7-c1732a34c595" containerID="a5d061080f941268857492fe1c617e0feb6473904f4521e267f9afcd928e6256" exitCode=0 Apr 22 19:24:42.614616 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:42.614360 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lsf9" event={"ID":"5322e235-8e02-4de9-8bd7-c1732a34c595","Type":"ContainerDied","Data":"a5d061080f941268857492fe1c617e0feb6473904f4521e267f9afcd928e6256"} Apr 22 19:24:43.180527 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.180491 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7dd8c8d76-pknpl"] Apr 22 19:24:43.183989 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.183968 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.188245 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.188223 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:24:43.188524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.188504 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:24:43.188524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.188516 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 19:24:43.188905 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.188886 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-l8hk6\"" Apr 22 19:24:43.189901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.189880 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:24:43.189985 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.189882 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bsc79g1u22jem\"" Apr 22 19:24:43.199638 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.199588 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dd8c8d76-pknpl"] Apr 22 19:24:43.200003 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.199949 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-metrics-server-audit-profiles\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") 
" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200082 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200017 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwl6x\" (UniqueName: \"kubernetes.io/projected/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-kube-api-access-fwl6x\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200082 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200056 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200190 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200102 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-client-certs\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200240 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200178 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-audit-log\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200240 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200220 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-client-ca-bundle\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.200331 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.200298 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-tls\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-client-certs\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-audit-log\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301510 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-client-ca-bundle\") pod 
\"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-tls\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301628 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-metrics-server-audit-profiles\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301902 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301662 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwl6x\" (UniqueName: \"kubernetes.io/projected/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-kube-api-access-fwl6x\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.301902 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.301716 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.302001 ip-10-0-143-198 kubenswrapper[2564]: 
I0422 19:24:43.301985 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-audit-log\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.302685 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.302654 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.302951 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.302928 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-metrics-server-audit-profiles\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.304417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.304385 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-client-certs\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.304586 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.304564 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-client-ca-bundle\") pod 
\"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.304685 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.304627 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-secret-metrics-server-tls\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.310145 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.310123 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwl6x\" (UniqueName: \"kubernetes.io/projected/44ee1a4f-9243-4b44-8982-b2d0a6bb4431-kube-api-access-fwl6x\") pod \"metrics-server-7dd8c8d76-pknpl\" (UID: \"44ee1a4f-9243-4b44-8982-b2d0a6bb4431\") " pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.495939 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.495914 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:24:43.567129 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.567104 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dq5b5" Apr 22 19:24:43.621559 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.621520 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lsf9" event={"ID":"5322e235-8e02-4de9-8bd7-c1732a34c595","Type":"ContainerStarted","Data":"1bd694cb121a9de2b762812419db597f41fc89c7a6021b39baa8ecaa5a4ebec4"} Apr 22 19:24:43.621938 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.621569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9lsf9" event={"ID":"5322e235-8e02-4de9-8bd7-c1732a34c595","Type":"ContainerStarted","Data":"ce07342d8ffd11ee0b0f28e55a0ed3a4dbeab7d3d511a812faaa2dd32a9e65bd"} Apr 22 19:24:43.623138 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.623073 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c424d6e-0497-4717-a020-80361697c6d9" containerID="4a2759be844486134c2dcb563d839a23e1000db0ea132363a0fc87cc75480415" exitCode=0 Apr 22 19:24:43.623410 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.623346 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerDied","Data":"4a2759be844486134c2dcb563d839a23e1000db0ea132363a0fc87cc75480415"} Apr 22 19:24:43.625268 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.625222 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"7b2c3fb72b8c5e1a72fdf3eb1743566f16ed56e23aa356fdd41d8181225622c0"} Apr 22 19:24:43.625268 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.625247 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"e80d6c8785d35a45bbe122e3a64ff912dffb4b68043e865a8002d16e350cf5f5"} Apr 22 19:24:43.628437 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.628411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" event={"ID":"5c25b062-ea8e-4f67-b431-931fbd0173f4","Type":"ContainerStarted","Data":"49dc5aea00825c275341a6a674b061daf97b21c31ad193848f301bdc27458b8c"} Apr 22 19:24:43.628528 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.628443 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" event={"ID":"5c25b062-ea8e-4f67-b431-931fbd0173f4","Type":"ContainerStarted","Data":"9ccd7f9dd7b98b7803147df2e0c8701a2fcaf7a7770547a26882326d6ad06655"} Apr 22 19:24:43.628528 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.628457 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" event={"ID":"5c25b062-ea8e-4f67-b431-931fbd0173f4","Type":"ContainerStarted","Data":"5bd9e41c9a14bc5c91c542f27e7a21f537048286a8e8dc6bea8907071b63d742"} Apr 22 19:24:43.645715 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.645157 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dd8c8d76-pknpl"] Apr 22 19:24:43.650421 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:43.650396 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ee1a4f_9243_4b44_8982_b2d0a6bb4431.slice/crio-eeeef6d51e00bed846ef67bf014531b120801ee4245a9c287de6eb8fb59542c3 WatchSource:0}: Error finding container eeeef6d51e00bed846ef67bf014531b120801ee4245a9c287de6eb8fb59542c3: Status 404 returned error can't find the container 
with id eeeef6d51e00bed846ef67bf014531b120801ee4245a9c287de6eb8fb59542c3 Apr 22 19:24:43.671513 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.671229 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9lsf9" podStartSLOduration=4.073063674 podStartE2EDuration="5.671213317s" podCreationTimestamp="2026-04-22 19:24:38 +0000 UTC" firstStartedPulling="2026-04-22 19:24:39.874564369 +0000 UTC m=+58.068758938" lastFinishedPulling="2026-04-22 19:24:41.472714004 +0000 UTC m=+59.666908581" observedRunningTime="2026-04-22 19:24:43.646471168 +0000 UTC m=+61.840665766" watchObservedRunningTime="2026-04-22 19:24:43.671213317 +0000 UTC m=+61.865407950" Apr 22 19:24:43.672244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.672217 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq"] Apr 22 19:24:43.675424 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.675407 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:43.677380 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.677337 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-w5v7d" podStartSLOduration=3.936116526 podStartE2EDuration="5.677325512s" podCreationTimestamp="2026-04-22 19:24:38 +0000 UTC" firstStartedPulling="2026-04-22 19:24:41.004663852 +0000 UTC m=+59.198858418" lastFinishedPulling="2026-04-22 19:24:42.745872821 +0000 UTC m=+60.940067404" observedRunningTime="2026-04-22 19:24:43.676992311 +0000 UTC m=+61.871186918" watchObservedRunningTime="2026-04-22 19:24:43.677325512 +0000 UTC m=+61.871520098" Apr 22 19:24:43.678561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.678546 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6zd94\"" Apr 22 19:24:43.678786 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.678771 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:24:43.682813 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.682794 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq"] Apr 22 19:24:43.705431 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.704916 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a25a1776-3de7-4264-8c9d-13d256c65549-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s5bq\" (UID: \"a25a1776-3de7-4264-8c9d-13d256c65549\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:43.806769 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.806738 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a25a1776-3de7-4264-8c9d-13d256c65549-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s5bq\" (UID: \"a25a1776-3de7-4264-8c9d-13d256c65549\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:43.808982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.808964 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a25a1776-3de7-4264-8c9d-13d256c65549-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s5bq\" (UID: \"a25a1776-3de7-4264-8c9d-13d256c65549\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:43.985040 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.984959 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:43.991588 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.991562 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"] Apr 22 19:24:43.995963 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.995946 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" Apr 22 19:24:43.998684 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998653 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 19:24:43.998969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998654 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 19:24:43.998969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998731 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 19:24:43.998969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998773 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8t2zl\"" Apr 22 19:24:43.998969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998718 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 19:24:43.998969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:43.998735 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 19:24:44.004460 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.004115 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 19:24:44.007260 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.007235 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"] Apr 22 19:24:44.108768 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.108739 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" Apr 22 19:24:44.108924 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.108813 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" Apr 22 19:24:44.108924 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.108839 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-serving-certs-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" Apr 22 19:24:44.109012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.108982 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" Apr 22 19:24:44.109012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.109014 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-metrics-client-ca\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.109012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.109029 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whl4k\" (UniqueName: \"kubernetes.io/projected/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-kube-api-access-whl4k\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.109207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.109070 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.109207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.109113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-federate-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.122775 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.122738 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq"]
Apr 22 19:24:44.210078 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210039 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210256 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210105 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-serving-certs-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210256 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210175 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210256 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210202 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-metrics-client-ca\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210256 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210228 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whl4k\" (UniqueName: \"kubernetes.io/projected/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-kube-api-access-whl4k\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210303 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-federate-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.210429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.210334 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.211040 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.211007 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.211133 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.211062 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-metrics-client-ca\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.211133 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.211089 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-serving-certs-ca-bundle\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.213620 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.213596 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-telemeter-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.214116 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.214077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.214462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.214439 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.214462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.214453 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-federate-client-tls\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.218492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.218467 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whl4k\" (UniqueName: \"kubernetes.io/projected/98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f-kube-api-access-whl4k\") pod \"telemeter-client-6ddbcb786b-49sn6\" (UID: \"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f\") " pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.307712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.307450 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"
Apr 22 19:24:44.516551 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.516054 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6ddbcb786b-49sn6"]
Apr 22 19:24:44.518583 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:44.518549 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98922dc3_030c_4d6b_ba9f_c7bdbc8dc55f.slice/crio-4397bf35b5a74d62981de6c77bdbeeab6d6e98bd9b892557b3c1c5220b0d6998 WatchSource:0}: Error finding container 4397bf35b5a74d62981de6c77bdbeeab6d6e98bd9b892557b3c1c5220b0d6998: Status 404 returned error can't find the container with id 4397bf35b5a74d62981de6c77bdbeeab6d6e98bd9b892557b3c1c5220b0d6998
Apr 22 19:24:44.637853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.637776 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"c4f23f8a80b74713354cca7b3bd35503f3a795cc56e760d9ceff87cf60f392ec"}
Apr 22 19:24:44.637853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.637815 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"dcb3e07f706d9e4561c1e58afb3f10a35c00ffdc4620f82679cabf67b45ccd25"}
Apr 22 19:24:44.637853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.637830 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"edf608bb05957c094d4578c6b805690d34a476496df72ca92ea0048061e3ae4a"}
Apr 22 19:24:44.640654 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.640595 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" event={"ID":"44ee1a4f-9243-4b44-8982-b2d0a6bb4431","Type":"ContainerStarted","Data":"eeeef6d51e00bed846ef67bf014531b120801ee4245a9c287de6eb8fb59542c3"}
Apr 22 19:24:44.643361 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.643324 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" event={"ID":"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f","Type":"ContainerStarted","Data":"4397bf35b5a74d62981de6c77bdbeeab6d6e98bd9b892557b3c1c5220b0d6998"}
Apr 22 19:24:44.646203 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:44.645946 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" event={"ID":"a25a1776-3de7-4264-8c9d-13d256c65549","Type":"ContainerStarted","Data":"daaa0cba61eca3a675959736e66fb0387a7a5300f2e06236e79556bc2eb13bed"}
Apr 22 19:24:45.122215 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.122152 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:24:45.137386 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.137350 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.140833 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.140738 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 19:24:45.141330 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.141158 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 19:24:45.142188 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.141949 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 19:24:45.142188 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.141969 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 19:24:45.142364 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142292 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142294 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-s69dc\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142450 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142539 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142825 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.142994 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7390n0328f8fb\""
Apr 22 19:24:45.143426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.143300 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 19:24:45.144089 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.143920 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 19:24:45.148884 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.147831 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 19:24:45.149937 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.149756 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 19:24:45.150287 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.150261 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:24:45.221421 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221386 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221435 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221454 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221512 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221529 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221561 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221556 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221582 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221603 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221624 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221647 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221666 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttv7\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-kube-api-access-vttv7\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221683 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221805 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221820 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221838 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221855 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.221971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.221872 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323123 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323093 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323126 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323150 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323234 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323274 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323306 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323360 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vttv7\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-kube-api-access-vttv7\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323420 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323495 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323519 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323579 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323604 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.323748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.323644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.326820 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.327266 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.328052 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.328101 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.328562 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329262 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.328721 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.329657 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.329476 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.332562 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.331925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.332562 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.332159 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.333337 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.333311 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.333431 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.333390 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.336712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.334893 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.338406 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.338383 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.338744 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.338724 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.341228 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.341178 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.341318 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.341260 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.341842 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.341563 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttv7\" (UniqueName: \"kubernetes.io/projected/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-kube-api-access-vttv7\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.344924 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.344902 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a37fef64-58f3-4e78-8cf7-c2b0e4415b6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.454252 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.454176 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:24:45.470088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.470055 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78744977db-wmsfk"]
Apr 22 19:24:45.501569 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.501532 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"]
Apr 22 19:24:45.519579 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.519551 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"]
Apr 22 19:24:45.519743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.519653 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77f6d6c664-tvq5q"
Apr 22 19:24:45.626052 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626009 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q"
Apr 22 19:24:45.626226 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626075 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q"
Apr 22 19:24:45.626226 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626108 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.626226 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.626354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626280 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.626354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnsvp\" (UniqueName: \"kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.626354 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.626341 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 
19:24:45.652948 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.652905 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" event={"ID":"9b1e50bb-3dd3-46b1-a930-24324c91640e","Type":"ContainerStarted","Data":"5594c94540852fa5e208570f14eb92a928820a3c01ae47c8494f488494987ebf"} Apr 22 19:24:45.653379 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.653120 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" Apr 22 19:24:45.676133 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.676074 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" podStartSLOduration=2.843636981 podStartE2EDuration="5.676057056s" podCreationTimestamp="2026-04-22 19:24:40 +0000 UTC" firstStartedPulling="2026-04-22 19:24:41.578666841 +0000 UTC m=+59.772861409" lastFinishedPulling="2026-04-22 19:24:44.411086915 +0000 UTC m=+62.605281484" observedRunningTime="2026-04-22 19:24:45.674606183 +0000 UTC m=+63.868800786" watchObservedRunningTime="2026-04-22 19:24:45.676057056 +0000 UTC m=+63.870251645" Apr 22 19:24:45.727205 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727095 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.727205 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727139 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnsvp\" (UniqueName: \"kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " 
pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.727417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727216 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.727417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727254 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.727417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728386 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727709 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728386 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.727808 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config\") pod 
\"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728386 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.728238 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728386 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.728301 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728664 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.728550 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.728664 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.728573 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.729952 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.729914 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.730444 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.730422 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.736089 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.736067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnsvp\" (UniqueName: \"kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp\") pod \"console-77f6d6c664-tvq5q\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:45.831658 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:45.831625 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:46.772427 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:46.768596 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:24:46.777028 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:46.776902 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"] Apr 22 19:24:46.781057 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:46.781016 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8047f3d_60e4_44af_adf2_154e2630e40b.slice/crio-3577feb92e707e6ed5c66122593cff67420550be0f79010b65ac67c81f35ebb9 WatchSource:0}: Error finding container 3577feb92e707e6ed5c66122593cff67420550be0f79010b65ac67c81f35ebb9: Status 404 returned error can't find the container with id 3577feb92e707e6ed5c66122593cff67420550be0f79010b65ac67c81f35ebb9 Apr 22 19:24:47.664467 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664427 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"3252d3e16e7aa62702df2bd58363d0761ee06b62790739cbe70d9d67f1ac260d"} Apr 22 19:24:47.664467 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664472 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"6e5f996a85da3d9cb2b8d5ee8d2dbaca15f3d794e4ea41f007816ac60dfd018c"} Apr 22 19:24:47.664720 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664488 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"5cf765eb3a0192e5afd137f40fd4989ffd8c149541b0dc1964b0ed38dcfa983e"} Apr 22 19:24:47.664720 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664500 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"a0472ecd498fcde2e6375813a1827fbc5a6e21227dbde93a668cc4e837ec87da"} Apr 22 19:24:47.664720 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664512 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"ccca61d6a1a877bc2c03818717c1e225cffbaf94f5704ee36f970517f502d58b"} Apr 22 19:24:47.664720 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.664525 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5c424d6e-0497-4717-a020-80361697c6d9","Type":"ContainerStarted","Data":"6033b585209019a924b59ed25d2fa266196ec0974483f69672e0716803b60508"} Apr 22 19:24:47.665916 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.665863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" event={"ID":"a25a1776-3de7-4264-8c9d-13d256c65549","Type":"ContainerStarted","Data":"06122e6bde3e8c8441c0de551defc8f896ca6ffd7c3c7e1492a0d5f6c1572297"} Apr 22 19:24:47.666103 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.666077 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:47.667375 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.667354 2564 generic.go:358] "Generic (PLEG): container finished" podID="a37fef64-58f3-4e78-8cf7-c2b0e4415b6a" containerID="e8bd26921ad8685fc32a9359c880ea1ac2fab9ed8dd127832fefa1d58b7ad0ec" 
exitCode=0 Apr 22 19:24:47.667456 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.667436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerDied","Data":"e8bd26921ad8685fc32a9359c880ea1ac2fab9ed8dd127832fefa1d58b7ad0ec"} Apr 22 19:24:47.667512 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.667459 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"2c184ed6c2cbc0f786cb7359850c5909b04eb681dad345c52407a4ae967a5367"} Apr 22 19:24:47.669046 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.669024 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" event={"ID":"44ee1a4f-9243-4b44-8982-b2d0a6bb4431","Type":"ContainerStarted","Data":"94256808dff2ebc36f578e2fb64b4feab2a1021550a10ca89f7b910466a0ad29"} Apr 22 19:24:47.671200 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.671180 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f6d6c664-tvq5q" event={"ID":"a8047f3d-60e4-44af-adf2-154e2630e40b","Type":"ContainerStarted","Data":"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"} Apr 22 19:24:47.671279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.671204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f6d6c664-tvq5q" event={"ID":"a8047f3d-60e4-44af-adf2-154e2630e40b","Type":"ContainerStarted","Data":"3577feb92e707e6ed5c66122593cff67420550be0f79010b65ac67c81f35ebb9"} Apr 22 19:24:47.672088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.672069 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" Apr 22 19:24:47.695606 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.695255 
2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.655786482 podStartE2EDuration="8.695239654s" podCreationTimestamp="2026-04-22 19:24:39 +0000 UTC" firstStartedPulling="2026-04-22 19:24:41.567689897 +0000 UTC m=+59.761884479" lastFinishedPulling="2026-04-22 19:24:46.607143065 +0000 UTC m=+64.801337651" observedRunningTime="2026-04-22 19:24:47.694110837 +0000 UTC m=+65.888305449" watchObservedRunningTime="2026-04-22 19:24:47.695239654 +0000 UTC m=+65.889434243" Apr 22 19:24:47.747019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.746949 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" podStartSLOduration=1.7895905619999999 podStartE2EDuration="4.746930009s" podCreationTimestamp="2026-04-22 19:24:43 +0000 UTC" firstStartedPulling="2026-04-22 19:24:43.652670757 +0000 UTC m=+61.846865332" lastFinishedPulling="2026-04-22 19:24:46.610010214 +0000 UTC m=+64.804204779" observedRunningTime="2026-04-22 19:24:47.745499043 +0000 UTC m=+65.939693630" watchObservedRunningTime="2026-04-22 19:24:47.746930009 +0000 UTC m=+65.941124597" Apr 22 19:24:47.770480 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.770427 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77f6d6c664-tvq5q" podStartSLOduration=2.770412855 podStartE2EDuration="2.770412855s" podCreationTimestamp="2026-04-22 19:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:47.770164167 +0000 UTC m=+65.964358755" watchObservedRunningTime="2026-04-22 19:24:47.770412855 +0000 UTC m=+65.964607441" Apr 22 19:24:47.788932 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:47.788883 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s5bq" podStartSLOduration=2.306608871 podStartE2EDuration="4.788868699s" podCreationTimestamp="2026-04-22 19:24:43 +0000 UTC" firstStartedPulling="2026-04-22 19:24:44.129016531 +0000 UTC m=+62.323211110" lastFinishedPulling="2026-04-22 19:24:46.611276359 +0000 UTC m=+64.805470938" observedRunningTime="2026-04-22 19:24:47.78763346 +0000 UTC m=+65.981828048" watchObservedRunningTime="2026-04-22 19:24:47.788868699 +0000 UTC m=+65.983063590" Apr 22 19:24:48.052545 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.052518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:48.055259 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.055242 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:48.065653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.065632 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f6d5518-179f-4f70-8c2c-5b1b2a244e38-metrics-certs\") pod \"network-metrics-daemon-jblt6\" (UID: \"6f6d5518-179f-4f70-8c2c-5b1b2a244e38\") " pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:48.170876 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.170849 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:24:48.254177 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.254153 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc67s\" (UniqueName: 
\"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:48.256729 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.256711 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:48.266844 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.266824 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:48.277469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.277451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc67s\" (UniqueName: \"kubernetes.io/projected/95433104-c840-4e8f-a3ff-c645c636f399-kube-api-access-kc67s\") pod \"network-check-target-m4bzg\" (UID: \"95433104-c840-4e8f-a3ff-c645c636f399\") " pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:48.283562 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.283544 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\"" Apr 22 19:24:48.287648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.287631 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7c2z6\"" Apr 22 19:24:48.291773 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.291755 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jblt6" Apr 22 19:24:48.295671 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.295651 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:48.430786 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.430680 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m4bzg"] Apr 22 19:24:48.434136 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:48.434102 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95433104_c840_4e8f_a3ff_c645c636f399.slice/crio-498dea0a59c2d87262911317d508fbf0cb28a846f9dc6ad84f0a63a308edc8d1 WatchSource:0}: Error finding container 498dea0a59c2d87262911317d508fbf0cb28a846f9dc6ad84f0a63a308edc8d1: Status 404 returned error can't find the container with id 498dea0a59c2d87262911317d508fbf0cb28a846f9dc6ad84f0a63a308edc8d1 Apr 22 19:24:48.451759 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.451722 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jblt6"] Apr 22 19:24:48.454808 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:48.454784 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f6d5518_179f_4f70_8c2c_5b1b2a244e38.slice/crio-520bc9d59afcc5a0d20c3835afa81fd3347e75c292fb6b7ca46af62e64dac8dd WatchSource:0}: Error finding container 520bc9d59afcc5a0d20c3835afa81fd3347e75c292fb6b7ca46af62e64dac8dd: Status 404 returned error can't find the container with id 520bc9d59afcc5a0d20c3835afa81fd3347e75c292fb6b7ca46af62e64dac8dd Apr 22 19:24:48.680367 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.680280 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m4bzg" event={"ID":"95433104-c840-4e8f-a3ff-c645c636f399","Type":"ContainerStarted","Data":"498dea0a59c2d87262911317d508fbf0cb28a846f9dc6ad84f0a63a308edc8d1"} Apr 22 19:24:48.681491 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:24:48.681463 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jblt6" event={"ID":"6f6d5518-179f-4f70-8c2c-5b1b2a244e38","Type":"ContainerStarted","Data":"520bc9d59afcc5a0d20c3835afa81fd3347e75c292fb6b7ca46af62e64dac8dd"} Apr 22 19:24:48.683405 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.683333 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" event={"ID":"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f","Type":"ContainerStarted","Data":"023afd286a5db5e04c4a37797387221200535a0d3a766c902243c16e48a0aaa8"} Apr 22 19:24:48.683405 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.683367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" event={"ID":"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f","Type":"ContainerStarted","Data":"29546d41addd4ed6d9be8e79f314c2db9df614ebb84c4bbcaa4900ab30ede8fe"} Apr 22 19:24:48.683567 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.683422 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" event={"ID":"98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f","Type":"ContainerStarted","Data":"420fbf33cdeb93b1d0cbf7046ff055f9dd1325801b8c5a4928281349e27276ec"} Apr 22 19:24:48.707897 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:48.707845 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6ddbcb786b-49sn6" podStartSLOduration=2.15707495 podStartE2EDuration="5.707830871s" podCreationTimestamp="2026-04-22 19:24:43 +0000 UTC" firstStartedPulling="2026-04-22 19:24:44.521813557 +0000 UTC m=+62.716008142" lastFinishedPulling="2026-04-22 19:24:48.072569484 +0000 UTC m=+66.266764063" observedRunningTime="2026-04-22 19:24:48.705629223 +0000 UTC m=+66.899823810" watchObservedRunningTime="2026-04-22 19:24:48.707830871 +0000 UTC 
m=+66.902025458" Apr 22 19:24:49.322223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.322144 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"] Apr 22 19:24:49.352832 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.352795 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-756cd94657-px7j9"] Apr 22 19:24:49.359474 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.359444 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.364504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.364458 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756cd94657-px7j9"] Apr 22 19:24:49.468308 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468522 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468392 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468522 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468461 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: 
\"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468522 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468486 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468522 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468517 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.468712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.468599 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47c4n\" (UniqueName: \"kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569471 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569436 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569657 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569598 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569657 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569646 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569801 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569801 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569709 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569801 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569741 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.569945 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.569799 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47c4n\" (UniqueName: \"kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.570091 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.570065 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.570710 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.570666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.571019 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.570967 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.571019 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:24:49.570967 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.573373 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.573286 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.573373 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.573286 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.579123 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.579081 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47c4n\" (UniqueName: \"kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n\") pod \"console-756cd94657-px7j9\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") " pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:49.675948 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:49.675913 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:50.691968 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:50.691932 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jblt6" event={"ID":"6f6d5518-179f-4f70-8c2c-5b1b2a244e38","Type":"ContainerStarted","Data":"90438fae33f971dd06508e6d40878ca82839cfd08fee646ac8722b21e9089d8f"} Apr 22 19:24:51.664334 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:51.664301 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-644d9fc5c6-272bn" Apr 22 19:24:51.887353 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:51.887289 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756cd94657-px7j9"] Apr 22 19:24:51.890294 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:24:51.890267 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211de526_15ee_4db1_85e0_db1a6832e9e2.slice/crio-65c5b7830e330e30153394f349b9e71d87c48301d3043d72f0abeed92f19b08b WatchSource:0}: Error finding container 65c5b7830e330e30153394f349b9e71d87c48301d3043d72f0abeed92f19b08b: Status 404 returned error can't find the container with id 65c5b7830e330e30153394f349b9e71d87c48301d3043d72f0abeed92f19b08b Apr 22 19:24:52.700289 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.700248 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756cd94657-px7j9" event={"ID":"211de526-15ee-4db1-85e0-db1a6832e9e2","Type":"ContainerStarted","Data":"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"} Apr 22 19:24:52.700289 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.700291 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756cd94657-px7j9" 
event={"ID":"211de526-15ee-4db1-85e0-db1a6832e9e2","Type":"ContainerStarted","Data":"65c5b7830e330e30153394f349b9e71d87c48301d3043d72f0abeed92f19b08b"} Apr 22 19:24:52.703285 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703247 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"3f609cf80f5dd7d2e86aad04143facc5b41e10db26837bff349c2281ca6fcbcd"} Apr 22 19:24:52.703285 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703288 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"96f4a2152ad70fd8b6fed1a383b4f0936ef6496ca1949139517260f41a07c788"} Apr 22 19:24:52.703466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703302 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"36305ec3450f04baec30b32da2a01b4a6b0bce2d66eec6ebb85129cd2c858d08"} Apr 22 19:24:52.703466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703315 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"c43ba6a0951335b7babd02449be1f47b44db43e13a18ed00599783e64c5a22fc"} Apr 22 19:24:52.703466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703326 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"f5d0a4ab0a970e6146705506ee79c6d20107e9b423b613d60e1159a33a14a854"} Apr 22 19:24:52.703466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.703336 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a37fef64-58f3-4e78-8cf7-c2b0e4415b6a","Type":"ContainerStarted","Data":"7d4fc3c8204fe481147cc4e0ee44dcca0e0059e866f80873299d9171f2ea92a0"} Apr 22 19:24:52.704592 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.704566 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m4bzg" event={"ID":"95433104-c840-4e8f-a3ff-c645c636f399","Type":"ContainerStarted","Data":"5211b49131e84f4a6a5e5ddc06c2f7f42e15d253c2f3893d78b6dde35401e342"} Apr 22 19:24:52.704717 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.704681 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m4bzg" Apr 22 19:24:52.706246 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.706224 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jblt6" event={"ID":"6f6d5518-179f-4f70-8c2c-5b1b2a244e38","Type":"ContainerStarted","Data":"27d2949d3b152e5671ae1da4cc0081801865a44ad68fad6eb0a2b512bbf168a1"} Apr 22 19:24:52.719157 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.719101 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-756cd94657-px7j9" podStartSLOduration=3.7190699780000003 podStartE2EDuration="3.719069978s" podCreationTimestamp="2026-04-22 19:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:52.717950434 +0000 UTC m=+70.912145021" watchObservedRunningTime="2026-04-22 19:24:52.719069978 +0000 UTC m=+70.913264566" Apr 22 19:24:52.736168 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.736110 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jblt6" podStartSLOduration=69.588076902 podStartE2EDuration="1m10.736091695s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" 
firstStartedPulling="2026-04-22 19:24:48.457389342 +0000 UTC m=+66.651583909" lastFinishedPulling="2026-04-22 19:24:49.605404136 +0000 UTC m=+67.799598702" observedRunningTime="2026-04-22 19:24:52.734338172 +0000 UTC m=+70.928532760" watchObservedRunningTime="2026-04-22 19:24:52.736091695 +0000 UTC m=+70.930286285" Apr 22 19:24:52.761390 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.761329 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.677467513 podStartE2EDuration="7.761307169s" podCreationTimestamp="2026-04-22 19:24:45 +0000 UTC" firstStartedPulling="2026-04-22 19:24:47.668974303 +0000 UTC m=+65.863168878" lastFinishedPulling="2026-04-22 19:24:51.752813955 +0000 UTC m=+69.947008534" observedRunningTime="2026-04-22 19:24:52.759339454 +0000 UTC m=+70.953534056" watchObservedRunningTime="2026-04-22 19:24:52.761307169 +0000 UTC m=+70.955501757" Apr 22 19:24:52.774916 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:52.774871 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m4bzg" podStartSLOduration=67.449005613 podStartE2EDuration="1m10.774856072s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:24:48.436356636 +0000 UTC m=+66.630551202" lastFinishedPulling="2026-04-22 19:24:51.762207081 +0000 UTC m=+69.956401661" observedRunningTime="2026-04-22 19:24:52.774830284 +0000 UTC m=+70.969024872" watchObservedRunningTime="2026-04-22 19:24:52.774856072 +0000 UTC m=+70.969050660" Apr 22 19:24:55.455185 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:55.455147 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:24:55.832039 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:55.831985 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:24:59.676510 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:59.676482 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:59.676900 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:59.676820 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:59.681608 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:59.681583 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:24:59.736171 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:24:59.736148 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756cd94657-px7j9" Apr 22 19:25:03.496847 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:03.496731 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:25:03.496847 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:03.496773 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl" Apr 22 19:25:10.493111 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.493049 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78744977db-wmsfk" podUID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" containerName="console" containerID="cri-o://f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2" gracePeriod=15 Apr 22 19:25:10.732517 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.732494 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78744977db-wmsfk_bcf2c7d5-655c-413d-bdb2-cb1a7da45620/console/0.log" Apr 22 19:25:10.732655 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:25:10.732578 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:25:10.766901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.766820 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78744977db-wmsfk_bcf2c7d5-655c-413d-bdb2-cb1a7da45620/console/0.log" Apr 22 19:25:10.766901 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.766870 2564 generic.go:358] "Generic (PLEG): container finished" podID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" containerID="f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2" exitCode=2 Apr 22 19:25:10.767118 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.766940 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78744977db-wmsfk" Apr 22 19:25:10.767118 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.766946 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78744977db-wmsfk" event={"ID":"bcf2c7d5-655c-413d-bdb2-cb1a7da45620","Type":"ContainerDied","Data":"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2"} Apr 22 19:25:10.767118 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.766987 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78744977db-wmsfk" event={"ID":"bcf2c7d5-655c-413d-bdb2-cb1a7da45620","Type":"ContainerDied","Data":"6c66752aa7257f19f10dedab1de0061c9cdefdc339ce27779f77fa533f931c0b"} Apr 22 19:25:10.767118 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.767007 2564 scope.go:117] "RemoveContainer" containerID="f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2" Apr 22 19:25:10.776518 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.776499 2564 scope.go:117] "RemoveContainer" containerID="f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2" Apr 22 19:25:10.776877 ip-10-0-143-198 
kubenswrapper[2564]: E0422 19:25:10.776849 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2\": container with ID starting with f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2 not found: ID does not exist" containerID="f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2" Apr 22 19:25:10.776970 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.776890 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2"} err="failed to get container status \"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2\": rpc error: code = NotFound desc = could not find container \"f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2\": container with ID starting with f58841bbbeb0fdf325101a5afeca466314c173d8632e2b1347ca40003ea3b2d2 not found: ID does not exist" Apr 22 19:25:10.872361 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872328 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872506 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872394 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872506 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872418 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872711 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872673 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872792 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872760 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872792 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872784 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:10.872895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872801 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.872895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872799 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca" (OuterVolumeSpecName: "service-ca") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:10.872895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.872872 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw95f\" (UniqueName: \"kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f\") pod \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\" (UID: \"bcf2c7d5-655c-413d-bdb2-cb1a7da45620\") " Apr 22 19:25:10.873070 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.873040 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:10.873196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.873093 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config" (OuterVolumeSpecName: "console-config") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:10.873196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.873159 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-service-ca\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.873196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.873174 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-trusted-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.873196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.873184 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-oauth-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.874672 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.874649 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:10.874862 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.874834 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:10.874907 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.874844 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f" (OuterVolumeSpecName: "kube-api-access-fw95f") pod "bcf2c7d5-655c-413d-bdb2-cb1a7da45620" (UID: "bcf2c7d5-655c-413d-bdb2-cb1a7da45620"). InnerVolumeSpecName "kube-api-access-fw95f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:10.974428 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.974395 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.974428 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.974424 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fw95f\" (UniqueName: \"kubernetes.io/projected/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-kube-api-access-fw95f\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.974428 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.974433 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-oauth-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:10.974640 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:10.974443 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bcf2c7d5-655c-413d-bdb2-cb1a7da45620-console-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:25:11.088669 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:11.088641 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78744977db-wmsfk"] Apr 22 19:25:11.097988 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:11.097962 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78744977db-wmsfk"] Apr 22 19:25:12.367847 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:12.367813 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" path="/var/lib/kubelet/pods/bcf2c7d5-655c-413d-bdb2-cb1a7da45620/volumes" Apr 22 19:25:14.715068 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:14.715030 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77f6d6c664-tvq5q" podUID="a8047f3d-60e4-44af-adf2-154e2630e40b" containerName="console" containerID="cri-o://9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c" gracePeriod=15 Apr 22 19:25:14.963774 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:14.963747 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f6d6c664-tvq5q_a8047f3d-60e4-44af-adf2-154e2630e40b/console/0.log" Apr 22 19:25:14.963903 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:14.963824 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f6d6c664-tvq5q" Apr 22 19:25:15.109809 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109780 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " Apr 22 19:25:15.109991 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109862 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnsvp\" (UniqueName: \"kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " Apr 22 19:25:15.109991 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109883 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " Apr 22 19:25:15.109991 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109900 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " Apr 22 19:25:15.109991 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109928 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") " Apr 22 19:25:15.109991 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:25:15.109958 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") "
Apr 22 19:25:15.110221 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.109998 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config\") pod \"a8047f3d-60e4-44af-adf2-154e2630e40b\" (UID: \"a8047f3d-60e4-44af-adf2-154e2630e40b\") "
Apr 22 19:25:15.110375 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.110353 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca" (OuterVolumeSpecName: "service-ca") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:15.110683 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.110636 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:15.110818 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.110688 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config" (OuterVolumeSpecName: "console-config") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:15.110818 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.110743 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:15.111998 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.111976 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:25:15.112453 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.112436 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp" (OuterVolumeSpecName: "kube-api-access-wnsvp") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "kube-api-access-wnsvp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:25:15.112453 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.112439 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a8047f3d-60e4-44af-adf2-154e2630e40b" (UID: "a8047f3d-60e4-44af-adf2-154e2630e40b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:25:15.211166 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211135 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wnsvp\" (UniqueName: \"kubernetes.io/projected/a8047f3d-60e4-44af-adf2-154e2630e40b-kube-api-access-wnsvp\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211166 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211162 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-service-ca\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211166 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211171 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-trusted-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211180 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-console-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211190 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8047f3d-60e4-44af-adf2-154e2630e40b-oauth-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211199 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-oauth-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.211393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.211208 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8047f3d-60e4-44af-adf2-154e2630e40b-console-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:25:15.783347 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783321 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f6d6c664-tvq5q_a8047f3d-60e4-44af-adf2-154e2630e40b/console/0.log"
Apr 22 19:25:15.783743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783358 2564 generic.go:358] "Generic (PLEG): container finished" podID="a8047f3d-60e4-44af-adf2-154e2630e40b" containerID="9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c" exitCode=2
Apr 22 19:25:15.783743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783419 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f6d6c664-tvq5q" event={"ID":"a8047f3d-60e4-44af-adf2-154e2630e40b","Type":"ContainerDied","Data":"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"}
Apr 22 19:25:15.783743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783423 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77f6d6c664-tvq5q"
Apr 22 19:25:15.783743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783446 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f6d6c664-tvq5q" event={"ID":"a8047f3d-60e4-44af-adf2-154e2630e40b","Type":"ContainerDied","Data":"3577feb92e707e6ed5c66122593cff67420550be0f79010b65ac67c81f35ebb9"}
Apr 22 19:25:15.783743 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.783460 2564 scope.go:117] "RemoveContainer" containerID="9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"
Apr 22 19:25:15.793429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.792519 2564 scope.go:117] "RemoveContainer" containerID="9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"
Apr 22 19:25:15.793429 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:25:15.793377 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c\": container with ID starting with 9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c not found: ID does not exist" containerID="9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"
Apr 22 19:25:15.793429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.793410 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c"} err="failed to get container status \"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c\": rpc error: code = NotFound desc = could not find container \"9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c\": container with ID starting with 9898c8fa82f5f9c160c30108e3fe6c6c24e8e5e33d56005e71e878a1f5b9184c not found: ID does not exist"
Apr 22 19:25:15.806441 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.806415 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"]
Apr 22 19:25:15.809374 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:15.809352 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77f6d6c664-tvq5q"]
Apr 22 19:25:16.367137 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:16.367103 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8047f3d-60e4-44af-adf2-154e2630e40b" path="/var/lib/kubelet/pods/a8047f3d-60e4-44af-adf2-154e2630e40b/volumes"
Apr 22 19:25:23.502167 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:23.502140 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl"
Apr 22 19:25:23.506041 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:23.506017 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7dd8c8d76-pknpl"
Apr 22 19:25:23.712711 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:23.712672 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m4bzg"
Apr 22 19:25:45.454871 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:45.454829 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:25:45.476458 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:45.476430 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:25:45.889661 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:25:45.889637 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:26:13.044456 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:13.044429 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756cd94657-px7j9"]
Apr 22 19:26:38.062476 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.062418 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-756cd94657-px7j9" podUID="211de526-15ee-4db1-85e0-db1a6832e9e2" containerName="console" containerID="cri-o://496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8" gracePeriod=15
Apr 22 19:26:38.297893 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.297870 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756cd94657-px7j9_211de526-15ee-4db1-85e0-db1a6832e9e2/console/0.log"
Apr 22 19:26:38.298012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.297940 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756cd94657-px7j9"
Apr 22 19:26:38.423223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423131 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423170 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423203 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423223 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423223 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423277 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423307 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423341 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47c4n\" (UniqueName: \"kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n\") pod \"211de526-15ee-4db1-85e0-db1a6832e9e2\" (UID: \"211de526-15ee-4db1-85e0-db1a6832e9e2\") "
Apr 22 19:26:38.423712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423672 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:38.423795 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423775 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config" (OuterVolumeSpecName: "console-config") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:38.423852 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423830 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:38.423898 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.423884 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca" (OuterVolumeSpecName: "service-ca") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:38.425575 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.425540 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:26:38.425667 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.425575 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n" (OuterVolumeSpecName: "kube-api-access-47c4n") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "kube-api-access-47c4n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:26:38.425667 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.425637 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "211de526-15ee-4db1-85e0-db1a6832e9e2" (UID: "211de526-15ee-4db1-85e0-db1a6832e9e2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:26:38.524582 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524547 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-console-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524582 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524576 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-trusted-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524582 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524585 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47c4n\" (UniqueName: \"kubernetes.io/projected/211de526-15ee-4db1-85e0-db1a6832e9e2-kube-api-access-47c4n\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524595 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-oauth-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524604 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-oauth-config\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524613 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/211de526-15ee-4db1-85e0-db1a6832e9e2-console-serving-cert\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:38.524827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:38.524621 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/211de526-15ee-4db1-85e0-db1a6832e9e2-service-ca\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:26:39.023017 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.022991 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756cd94657-px7j9_211de526-15ee-4db1-85e0-db1a6832e9e2/console/0.log"
Apr 22 19:26:39.023180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.023027 2564 generic.go:358] "Generic (PLEG): container finished" podID="211de526-15ee-4db1-85e0-db1a6832e9e2" containerID="496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8" exitCode=2
Apr 22 19:26:39.023180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.023075 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756cd94657-px7j9" event={"ID":"211de526-15ee-4db1-85e0-db1a6832e9e2","Type":"ContainerDied","Data":"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"}
Apr 22 19:26:39.023180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.023092 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756cd94657-px7j9"
Apr 22 19:26:39.023180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.023108 2564 scope.go:117] "RemoveContainer" containerID="496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"
Apr 22 19:26:39.023180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.023098 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756cd94657-px7j9" event={"ID":"211de526-15ee-4db1-85e0-db1a6832e9e2","Type":"ContainerDied","Data":"65c5b7830e330e30153394f349b9e71d87c48301d3043d72f0abeed92f19b08b"}
Apr 22 19:26:39.031549 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.031532 2564 scope.go:117] "RemoveContainer" containerID="496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"
Apr 22 19:26:39.031844 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:26:39.031823 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8\": container with ID starting with 496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8 not found: ID does not exist" containerID="496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"
Apr 22 19:26:39.031906 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.031851 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8"} err="failed to get container status \"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8\": rpc error: code = NotFound desc = could not find container \"496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8\": container with ID starting with 496ceda4d470611f2c826e197b5b02c71d6ba7d91075cbdc6d06620621d462c8 not found: ID does not exist"
Apr 22 19:26:39.045095 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.045071 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756cd94657-px7j9"]
Apr 22 19:26:39.049014 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:39.048993 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-756cd94657-px7j9"]
Apr 22 19:26:40.367461 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.367429 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211de526-15ee-4db1-85e0-db1a6832e9e2" path="/var/lib/kubelet/pods/211de526-15ee-4db1-85e0-db1a6832e9e2/volumes"
Apr 22 19:26:40.698036 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.697962 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wkh5z"]
Apr 22 19:26:40.698295 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698283 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="211de526-15ee-4db1-85e0-db1a6832e9e2" containerName="console"
Apr 22 19:26:40.698336 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698298 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="211de526-15ee-4db1-85e0-db1a6832e9e2" containerName="console"
Apr 22 19:26:40.698336 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698321 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8047f3d-60e4-44af-adf2-154e2630e40b" containerName="console"
Apr 22 19:26:40.698336 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698326 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8047f3d-60e4-44af-adf2-154e2630e40b" containerName="console"
Apr 22 19:26:40.698420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698337 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" containerName="console"
Apr 22 19:26:40.698420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698342 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" containerName="console"
Apr 22 19:26:40.698420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698392 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="211de526-15ee-4db1-85e0-db1a6832e9e2" containerName="console"
Apr 22 19:26:40.698420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698400 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8047f3d-60e4-44af-adf2-154e2630e40b" containerName="console"
Apr 22 19:26:40.698420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.698409 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcf2c7d5-655c-413d-bdb2-cb1a7da45620" containerName="console"
Apr 22 19:26:40.702808 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.702792 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.705674 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.705650 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:26:40.710090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.710065 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wkh5z"]
Apr 22 19:26:40.845242 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.845209 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-dbus\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.845426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.845256 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-original-pull-secret\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.845426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.845384 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-kubelet-config\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.946457 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.946425 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-kubelet-config\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.946625 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.946494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-dbus\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.946625 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.946524 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-original-pull-secret\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.946625 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.946545 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-kubelet-config\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.946803 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.946731 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-dbus\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:40.948773 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:40.948722 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c34d1177-43b6-4961-8baf-9e9e12cd0ee6-original-pull-secret\") pod \"global-pull-secret-syncer-wkh5z\" (UID: \"c34d1177-43b6-4961-8baf-9e9e12cd0ee6\") " pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:41.011758 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:41.011735 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wkh5z"
Apr 22 19:26:41.137794 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:41.137620 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wkh5z"]
Apr 22 19:26:41.140218 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:26:41.140185 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34d1177_43b6_4961_8baf_9e9e12cd0ee6.slice/crio-8a30a69a421f2d6cf9fe532158862655e60c04c08c55bce9b46912ccf2a2f399 WatchSource:0}: Error finding container 8a30a69a421f2d6cf9fe532158862655e60c04c08c55bce9b46912ccf2a2f399: Status 404 returned error can't find the container with id 8a30a69a421f2d6cf9fe532158862655e60c04c08c55bce9b46912ccf2a2f399
Apr 22 19:26:42.036748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:42.036707 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wkh5z" event={"ID":"c34d1177-43b6-4961-8baf-9e9e12cd0ee6","Type":"ContainerStarted","Data":"8a30a69a421f2d6cf9fe532158862655e60c04c08c55bce9b46912ccf2a2f399"}
Apr 22 19:26:45.047270 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:45.047236 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wkh5z" event={"ID":"c34d1177-43b6-4961-8baf-9e9e12cd0ee6","Type":"ContainerStarted","Data":"28bf659f127419f160519d739015b2201e5c37e0415259098696df520211e8fa"}
Apr 22 19:26:45.080610 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:26:45.077214 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wkh5z" podStartSLOduration=1.400813644 podStartE2EDuration="5.077196737s" podCreationTimestamp="2026-04-22 19:26:40 +0000 UTC" firstStartedPulling="2026-04-22 19:26:41.141779069 +0000 UTC m=+179.335973634" lastFinishedPulling="2026-04-22 19:26:44.818162157 +0000 UTC m=+183.012356727" observedRunningTime="2026-04-22 19:26:45.074909722 +0000 UTC m=+183.269104309" watchObservedRunningTime="2026-04-22 19:26:45.077196737 +0000 UTC m=+183.271391325"
Apr 22 19:28:42.256447 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:28:42.256417 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:28:42.256925 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:28:42.256676 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:28:42.262425 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:28:42.262410 2564 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 19:29:58.350249 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.350212 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5899c94d75-hqjsz"]
Apr 22 19:29:58.353485 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.353468 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.357244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357219 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 19:29:58.357385 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357252 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 19:29:58.357459 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357390 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 19:29:58.357524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357475 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 19:29:58.357524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357484 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 19:29:58.357524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357511 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-skskp\""
Apr 22 19:29:58.357667 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357529 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 19:29:58.357826 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.357808 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 19:29:58.361157 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.361135 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 19:29:58.367618 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.367597 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5899c94d75-hqjsz"]
Apr 22 19:29:58.483384 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483348 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483384 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-oauth-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483619 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483458 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz265\" (UniqueName: \"kubernetes.io/projected/0f58728b-eff1-4516-bc7f-9544f7faf5b2-kube-api-access-gz265\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483619 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483500 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-oauth-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483619 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483573 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483619 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-service-ca\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.483828 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.483665 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-trusted-ca-bundle\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584227 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-oauth-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584253 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584275 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-service-ca\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584296 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-trusted-ca-bundle\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584329 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584352 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-oauth-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz"
Apr 22 19:29:58.584418 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.584393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz265\" (UniqueName: \"kubernetes.io/projected/0f58728b-eff1-4516-bc7f-9544f7faf5b2-kube-api-access-gz265\") pod
\"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.585116 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.585092 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.585216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.585154 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-oauth-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.585216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.585154 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-service-ca\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.585216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.585196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f58728b-eff1-4516-bc7f-9544f7faf5b2-trusted-ca-bundle\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.586648 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.586629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-serving-cert\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.586759 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.586634 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f58728b-eff1-4516-bc7f-9544f7faf5b2-console-oauth-config\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.592617 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.592591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz265\" (UniqueName: \"kubernetes.io/projected/0f58728b-eff1-4516-bc7f-9544f7faf5b2-kube-api-access-gz265\") pod \"console-5899c94d75-hqjsz\" (UID: \"0f58728b-eff1-4516-bc7f-9544f7faf5b2\") " pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.662735 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.662630 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:29:58.774880 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.774852 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5899c94d75-hqjsz"] Apr 22 19:29:58.777745 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:29:58.777718 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f58728b_eff1_4516_bc7f_9544f7faf5b2.slice/crio-63567c12fd47916f002ae64ed86fd517ad8e3ca20708ab5bd65fca8fe4e3f3dd WatchSource:0}: Error finding container 63567c12fd47916f002ae64ed86fd517ad8e3ca20708ab5bd65fca8fe4e3f3dd: Status 404 returned error can't find the container with id 63567c12fd47916f002ae64ed86fd517ad8e3ca20708ab5bd65fca8fe4e3f3dd Apr 22 19:29:58.779472 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:58.779456 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:29:59.595628 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:59.595590 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5899c94d75-hqjsz" event={"ID":"0f58728b-eff1-4516-bc7f-9544f7faf5b2","Type":"ContainerStarted","Data":"7dbb8b84cc955cef85d6c4be700d50ca5958fd3aebfd790975ca2ccce9950c25"} Apr 22 19:29:59.595628 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:59.595627 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5899c94d75-hqjsz" event={"ID":"0f58728b-eff1-4516-bc7f-9544f7faf5b2","Type":"ContainerStarted","Data":"63567c12fd47916f002ae64ed86fd517ad8e3ca20708ab5bd65fca8fe4e3f3dd"} Apr 22 19:29:59.619991 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:29:59.619943 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5899c94d75-hqjsz" podStartSLOduration=1.619926633 podStartE2EDuration="1.619926633s" podCreationTimestamp="2026-04-22 19:29:58 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:29:59.618833872 +0000 UTC m=+377.813028460" watchObservedRunningTime="2026-04-22 19:29:59.619926633 +0000 UTC m=+377.814121218" Apr 22 19:30:08.663850 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:30:08.663796 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:30:08.663850 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:30:08.663855 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:30:08.668398 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:30:08.668371 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:30:09.628530 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:30:09.628505 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5899c94d75-hqjsz" Apr 22 19:31:07.717652 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.717570 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-nzqdk"] Apr 22 19:31:07.720955 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.720935 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nzqdk" Apr 22 19:31:07.724239 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.724215 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:31:07.725447 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.725433 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6hbj8\"" Apr 22 19:31:07.725506 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.725434 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:31:07.725506 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.725464 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:31:07.728851 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.728828 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nzqdk"] Apr 22 19:31:07.775715 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.775655 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xs8b\" (UniqueName: \"kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b\") pod \"s3-init-nzqdk\" (UID: \"84048b42-efc8-4ee4-a171-ff4627d9d2e7\") " pod="kserve/s3-init-nzqdk" Apr 22 19:31:07.876351 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.876316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xs8b\" (UniqueName: \"kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b\") pod \"s3-init-nzqdk\" (UID: \"84048b42-efc8-4ee4-a171-ff4627d9d2e7\") " pod="kserve/s3-init-nzqdk" Apr 22 19:31:07.888916 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:07.888890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xs8b\" 
(UniqueName: \"kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b\") pod \"s3-init-nzqdk\" (UID: \"84048b42-efc8-4ee4-a171-ff4627d9d2e7\") " pod="kserve/s3-init-nzqdk" Apr 22 19:31:08.044191 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:08.044095 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nzqdk" Apr 22 19:31:08.160712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:08.160662 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nzqdk"] Apr 22 19:31:08.163200 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:31:08.163173 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84048b42_efc8_4ee4_a171_ff4627d9d2e7.slice/crio-d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82 WatchSource:0}: Error finding container d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82: Status 404 returned error can't find the container with id d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82 Apr 22 19:31:08.800686 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:08.800645 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nzqdk" event={"ID":"84048b42-efc8-4ee4-a171-ff4627d9d2e7","Type":"ContainerStarted","Data":"d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82"} Apr 22 19:31:12.813995 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:12.813962 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nzqdk" event={"ID":"84048b42-efc8-4ee4-a171-ff4627d9d2e7","Type":"ContainerStarted","Data":"cad53cb5b29109d8526bab0dc3a284f7c361083a39e88147048dc00c10912cb9"} Apr 22 19:31:12.831981 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:12.831937 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-nzqdk" podStartSLOduration=1.567305904 podStartE2EDuration="5.831921142s" 
podCreationTimestamp="2026-04-22 19:31:07 +0000 UTC" firstStartedPulling="2026-04-22 19:31:08.165271394 +0000 UTC m=+446.359465967" lastFinishedPulling="2026-04-22 19:31:12.42988664 +0000 UTC m=+450.624081205" observedRunningTime="2026-04-22 19:31:12.830295281 +0000 UTC m=+451.024489882" watchObservedRunningTime="2026-04-22 19:31:12.831921142 +0000 UTC m=+451.026115733" Apr 22 19:31:15.824425 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:15.824391 2564 generic.go:358] "Generic (PLEG): container finished" podID="84048b42-efc8-4ee4-a171-ff4627d9d2e7" containerID="cad53cb5b29109d8526bab0dc3a284f7c361083a39e88147048dc00c10912cb9" exitCode=0 Apr 22 19:31:15.824830 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:15.824455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nzqdk" event={"ID":"84048b42-efc8-4ee4-a171-ff4627d9d2e7","Type":"ContainerDied","Data":"cad53cb5b29109d8526bab0dc3a284f7c361083a39e88147048dc00c10912cb9"} Apr 22 19:31:16.944895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:16.944875 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nzqdk" Apr 22 19:31:17.068844 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.068813 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xs8b\" (UniqueName: \"kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b\") pod \"84048b42-efc8-4ee4-a171-ff4627d9d2e7\" (UID: \"84048b42-efc8-4ee4-a171-ff4627d9d2e7\") " Apr 22 19:31:17.070877 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.070854 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b" (OuterVolumeSpecName: "kube-api-access-9xs8b") pod "84048b42-efc8-4ee4-a171-ff4627d9d2e7" (UID: "84048b42-efc8-4ee4-a171-ff4627d9d2e7"). InnerVolumeSpecName "kube-api-access-9xs8b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:31:17.170231 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.170152 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xs8b\" (UniqueName: \"kubernetes.io/projected/84048b42-efc8-4ee4-a171-ff4627d9d2e7-kube-api-access-9xs8b\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:31:17.831079 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.831045 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nzqdk" event={"ID":"84048b42-efc8-4ee4-a171-ff4627d9d2e7","Type":"ContainerDied","Data":"d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82"} Apr 22 19:31:17.831079 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.831068 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nzqdk" Apr 22 19:31:17.831079 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:17.831077 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8fe3aec73fb67b61678bfab7dd7e0beb2f5de342f12dda74e6b352589530c82" Apr 22 19:31:26.919817 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.919785 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"] Apr 22 19:31:26.920181 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.920101 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84048b42-efc8-4ee4-a171-ff4627d9d2e7" containerName="s3-init" Apr 22 19:31:26.920181 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.920111 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="84048b42-efc8-4ee4-a171-ff4627d9d2e7" containerName="s3-init" Apr 22 19:31:26.920181 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.920177 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="84048b42-efc8-4ee4-a171-ff4627d9d2e7" containerName="s3-init" Apr 22 
19:31:26.923213 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.923196 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" Apr 22 19:31:26.925533 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.925516 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5ktbc\"" Apr 22 19:31:26.932801 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.932783 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" Apr 22 19:31:26.933099 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:26.933079 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"] Apr 22 19:31:27.043635 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.043610 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"] Apr 22 19:31:27.046294 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:31:27.046266 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a881ab_2edc_475b_bdaf_2dc283b1b9e9.slice/crio-a0691f422d2d444f253b507331e49f1808e75bd5318157e373ed6202cb15585e WatchSource:0}: Error finding container a0691f422d2d444f253b507331e49f1808e75bd5318157e373ed6202cb15585e: Status 404 returned error can't find the container with id a0691f422d2d444f253b507331e49f1808e75bd5318157e373ed6202cb15585e Apr 22 19:31:27.501235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.501200 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"] Apr 22 19:31:27.505383 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.505361 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:27.514183 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.514153 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"] Apr 22 19:31:27.660712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.660665 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd\" (UID: \"ca101244-9dd1-4827-887a-d36038109281\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:27.761763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.761649 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd\" (UID: \"ca101244-9dd1-4827-887a-d36038109281\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:27.762188 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.762121 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd\" (UID: \"ca101244-9dd1-4827-887a-d36038109281\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:27.820168 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.820125 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:27.869627 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:27.869565 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" event={"ID":"02a881ab-2edc-475b-bdaf-2dc283b1b9e9","Type":"ContainerStarted","Data":"a0691f422d2d444f253b507331e49f1808e75bd5318157e373ed6202cb15585e"} Apr 22 19:31:28.005434 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:28.005029 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"] Apr 22 19:31:28.008386 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:31:28.008354 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca101244_9dd1_4827_887a_d36038109281.slice/crio-22948ec4d1f17ba2bfc3b6af5130cc76bc0e68c2a6de6fd1b09bbe052ae19c70 WatchSource:0}: Error finding container 22948ec4d1f17ba2bfc3b6af5130cc76bc0e68c2a6de6fd1b09bbe052ae19c70: Status 404 returned error can't find the container with id 22948ec4d1f17ba2bfc3b6af5130cc76bc0e68c2a6de6fd1b09bbe052ae19c70 Apr 22 19:31:28.879303 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:28.879265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerStarted","Data":"22948ec4d1f17ba2bfc3b6af5130cc76bc0e68c2a6de6fd1b09bbe052ae19c70"} Apr 22 19:31:34.906170 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:34.906131 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerStarted","Data":"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350"} Apr 22 19:31:39.924673 ip-10-0-143-198 
kubenswrapper[2564]: I0422 19:31:39.924629 2564 generic.go:358] "Generic (PLEG): container finished" podID="ca101244-9dd1-4827-887a-d36038109281" containerID="bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350" exitCode=0 Apr 22 19:31:39.925187 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:39.924727 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerDied","Data":"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350"} Apr 22 19:31:41.932922 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:41.932880 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" event={"ID":"02a881ab-2edc-475b-bdaf-2dc283b1b9e9","Type":"ContainerStarted","Data":"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"} Apr 22 19:31:41.933362 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:41.933150 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" Apr 22 19:31:41.934405 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:41.934347 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 22 19:31:41.947424 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:41.947379 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podStartSLOduration=1.901985366 podStartE2EDuration="15.947363349s" podCreationTimestamp="2026-04-22 19:31:26 +0000 UTC" firstStartedPulling="2026-04-22 19:31:27.048013336 +0000 UTC 
m=+465.242207904" lastFinishedPulling="2026-04-22 19:31:41.093391318 +0000 UTC m=+479.287585887" observedRunningTime="2026-04-22 19:31:41.946958036 +0000 UTC m=+480.141152638" watchObservedRunningTime="2026-04-22 19:31:41.947363349 +0000 UTC m=+480.141557939" Apr 22 19:31:42.937000 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:42.936959 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 22 19:31:46.951191 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:46.951107 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerStarted","Data":"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae"} Apr 22 19:31:46.951598 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:46.951445 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:31:46.952601 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:46.952572 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 19:31:46.967953 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:46.967317 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podStartSLOduration=1.350959652 podStartE2EDuration="19.967302358s" podCreationTimestamp="2026-04-22 19:31:27 +0000 UTC" firstStartedPulling="2026-04-22 
19:31:28.01132884 +0000 UTC m=+466.205523421" lastFinishedPulling="2026-04-22 19:31:46.627671549 +0000 UTC m=+484.821866127" observedRunningTime="2026-04-22 19:31:46.966486615 +0000 UTC m=+485.160681201" watchObservedRunningTime="2026-04-22 19:31:46.967302358 +0000 UTC m=+485.161496946"
Apr 22 19:31:47.955392 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:47.955354 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:31:52.937182 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:52.937141 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 19:31:57.955643 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:31:57.955600 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:02.937130 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:02.937083 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 19:32:07.956249 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:07.956200 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:12.937525 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:12.937478 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 19:32:17.955603 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:17.955561 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:22.937934 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:22.937892 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 19:32:27.955756 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:27.955707 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:32.938407 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:32.938335 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"
Apr 22 19:32:37.955990 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:37.955942 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:46.949442 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.949407 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:32:46.952707 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.952681 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:46.955043 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.955020 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b2ca6-kube-rbac-proxy-sar-config\""
Apr 22 19:32:46.955142 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.955122 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b2ca6-serving-cert\""
Apr 22 19:32:46.955469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.955455 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:32:46.959529 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:46.959497 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:32:47.038198 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.038170 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.038198 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.038203 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.139047 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.139010 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.139218 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.139058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.139845 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.139818 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.141292 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.141270 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls\") pod \"switch-graph-b2ca6-6684fd76f-4dcj5\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") " pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.263981 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.263878 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:47.387098 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.387065 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:32:47.390553 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:32:47.390515 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e41a248_5ac8_4db6_937f_f5ac7023991e.slice/crio-df3775153601e05091d04bc752dc1eaf06a6ef76fbf18fb2b0bf30506ba7d1a3 WatchSource:0}: Error finding container df3775153601e05091d04bc752dc1eaf06a6ef76fbf18fb2b0bf30506ba7d1a3: Status 404 returned error can't find the container with id df3775153601e05091d04bc752dc1eaf06a6ef76fbf18fb2b0bf30506ba7d1a3
Apr 22 19:32:47.955975 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:47.955929 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 19:32:48.133494 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:48.133455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" event={"ID":"8e41a248-5ac8-4db6-937f-f5ac7023991e","Type":"ContainerStarted","Data":"df3775153601e05091d04bc752dc1eaf06a6ef76fbf18fb2b0bf30506ba7d1a3"}
Apr 22 19:32:51.144072 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:51.144039 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" event={"ID":"8e41a248-5ac8-4db6-937f-f5ac7023991e","Type":"ContainerStarted","Data":"823f6b86adec03a5058c771ba1a632f8ca8d56007b0d340b025520676d40633b"}
Apr 22 19:32:51.144486 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:51.144176 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:51.159790 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:51.159747 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podStartSLOduration=2.262186856 podStartE2EDuration="5.159731785s" podCreationTimestamp="2026-04-22 19:32:46 +0000 UTC" firstStartedPulling="2026-04-22 19:32:47.392457711 +0000 UTC m=+545.586652291" lastFinishedPulling="2026-04-22 19:32:50.290002653 +0000 UTC m=+548.484197220" observedRunningTime="2026-04-22 19:32:51.159329623 +0000 UTC m=+549.353524222" watchObservedRunningTime="2026-04-22 19:32:51.159731785 +0000 UTC m=+549.353926373"
Apr 22 19:32:57.152014 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:57.151985 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:32:57.957372 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:32:57.957343 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"
Apr 22 19:33:01.158237 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.158206 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:33:01.158664 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.158422 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" containerID="cri-o://823f6b86adec03a5058c771ba1a632f8ca8d56007b0d340b025520676d40633b" gracePeriod=30
Apr 22 19:33:01.276295 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.276261 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"]
Apr 22 19:33:01.276521 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.276498 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" containerID="cri-o://70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b" gracePeriod=30
Apr 22 19:33:01.330204 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.330174 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:33:01.333822 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.333801 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"
Apr 22 19:33:01.345837 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.345811 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:33:01.346541 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.346517 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"
Apr 22 19:33:01.468591 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:01.468555 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:33:01.471673 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:33:01.471646 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a370686_f446_4ece_b109_3f444ec79f60.slice/crio-7d2f268202d2ab07dce3de6eb97369776bc227bb92ad9c281cde3d47d66b36ed WatchSource:0}: Error finding container 7d2f268202d2ab07dce3de6eb97369776bc227bb92ad9c281cde3d47d66b36ed: Status 404 returned error can't find the container with id 7d2f268202d2ab07dce3de6eb97369776bc227bb92ad9c281cde3d47d66b36ed
Apr 22 19:33:02.151012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.150972 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:02.179393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.179360 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" event={"ID":"7a370686-f446-4ece-b109-3f444ec79f60","Type":"ContainerStarted","Data":"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"}
Apr 22 19:33:02.179393 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.179396 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" event={"ID":"7a370686-f446-4ece-b109-3f444ec79f60","Type":"ContainerStarted","Data":"7d2f268202d2ab07dce3de6eb97369776bc227bb92ad9c281cde3d47d66b36ed"}
Apr 22 19:33:02.179897 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.179586 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"
Apr 22 19:33:02.180818 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.180793 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:02.200853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.200809 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podStartSLOduration=1.200795922 podStartE2EDuration="1.200795922s" podCreationTimestamp="2026-04-22 19:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:02.200310215 +0000 UTC m=+560.394504798" watchObservedRunningTime="2026-04-22 19:33:02.200795922 +0000 UTC m=+560.394990570"
Apr 22 19:33:02.937769 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:02.937721 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 19:33:03.183126 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:03.183088 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:04.430877 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:04.430854 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"
Apr 22 19:33:05.188888 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.188858 2564 generic.go:358] "Generic (PLEG): container finished" podID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerID="70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b" exitCode=0
Apr 22 19:33:05.189064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.188918 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"
Apr 22 19:33:05.189064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.188921 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" event={"ID":"02a881ab-2edc-475b-bdaf-2dc283b1b9e9","Type":"ContainerDied","Data":"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"}
Apr 22 19:33:05.189064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.189007 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt" event={"ID":"02a881ab-2edc-475b-bdaf-2dc283b1b9e9","Type":"ContainerDied","Data":"a0691f422d2d444f253b507331e49f1808e75bd5318157e373ed6202cb15585e"}
Apr 22 19:33:05.189064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.189025 2564 scope.go:117] "RemoveContainer" containerID="70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"
Apr 22 19:33:05.197512 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.197490 2564 scope.go:117] "RemoveContainer" containerID="70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"
Apr 22 19:33:05.197784 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:33:05.197754 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b\": container with ID starting with 70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b not found: ID does not exist" containerID="70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"
Apr 22 19:33:05.197886 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.197784 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b"} err="failed to get container status \"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b\": rpc error: code = NotFound desc = could not find container \"70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b\": container with ID starting with 70368c6c54bedb89bde442cb1a6bf5152536e9482721191c2f92eba75571f54b not found: ID does not exist"
Apr 22 19:33:05.211514 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.211488 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"]
Apr 22 19:33:05.215766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:05.215747 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2ca6-predictor-5c9ff77945-6n7qt"]
Apr 22 19:33:06.367261 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:06.367228 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" path="/var/lib/kubelet/pods/02a881ab-2edc-475b-bdaf-2dc283b1b9e9/volumes"
Apr 22 19:33:07.151592 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:07.151551 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:12.150967 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:12.150929 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:12.151379 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:12.151028 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:33:13.183098 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:13.183056 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:17.151230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:17.151182 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:22.152716 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:22.152656 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:23.183965 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:23.183922 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:27.150957 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:27.150916 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:33:31.271557 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.271527 2564 generic.go:358] "Generic (PLEG): container finished" podID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerID="823f6b86adec03a5058c771ba1a632f8ca8d56007b0d340b025520676d40633b" exitCode=137
Apr 22 19:33:31.271884 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.271594 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" event={"ID":"8e41a248-5ac8-4db6-937f-f5ac7023991e","Type":"ContainerDied","Data":"823f6b86adec03a5058c771ba1a632f8ca8d56007b0d340b025520676d40633b"}
Apr 22 19:33:31.321538 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.321516 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:33:31.403879 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.403851 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls\") pod \"8e41a248-5ac8-4db6-937f-f5ac7023991e\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") "
Apr 22 19:33:31.404022 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.403895 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle\") pod \"8e41a248-5ac8-4db6-937f-f5ac7023991e\" (UID: \"8e41a248-5ac8-4db6-937f-f5ac7023991e\") "
Apr 22 19:33:31.404248 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.404222 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8e41a248-5ac8-4db6-937f-f5ac7023991e" (UID: "8e41a248-5ac8-4db6-937f-f5ac7023991e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:31.406002 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.405979 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8e41a248-5ac8-4db6-937f-f5ac7023991e" (UID: "8e41a248-5ac8-4db6-937f-f5ac7023991e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:33:31.504532 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.504455 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e41a248-5ac8-4db6-937f-f5ac7023991e-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:33:31.504532 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:31.504483 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e41a248-5ac8-4db6-937f-f5ac7023991e-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:33:32.275888 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.275856 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5" event={"ID":"8e41a248-5ac8-4db6-937f-f5ac7023991e","Type":"ContainerDied","Data":"df3775153601e05091d04bc752dc1eaf06a6ef76fbf18fb2b0bf30506ba7d1a3"}
Apr 22 19:33:32.276307 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.275898 2564 scope.go:117] "RemoveContainer" containerID="823f6b86adec03a5058c771ba1a632f8ca8d56007b0d340b025520676d40633b"
Apr 22 19:33:32.276307 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.275910 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"
Apr 22 19:33:32.298113 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.298088 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:33:32.301764 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.301738 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b2ca6-6684fd76f-4dcj5"]
Apr 22 19:33:32.368469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:32.368436 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" path="/var/lib/kubelet/pods/8e41a248-5ac8-4db6-937f-f5ac7023991e/volumes"
Apr 22 19:33:33.183930 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:33.183887 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:36.996239 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996197 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:33:36.996747 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996731 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container"
Apr 22 19:33:36.996747 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996749 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container"
Apr 22 19:33:36.996820 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996777 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6"
Apr 22 19:33:36.996820 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996783 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6"
Apr 22 19:33:36.996880 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996840 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="02a881ab-2edc-475b-bdaf-2dc283b1b9e9" containerName="kserve-container"
Apr 22 19:33:36.996880 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.996850 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e41a248-5ac8-4db6-937f-f5ac7023991e" containerName="switch-graph-b2ca6"
Apr 22 19:33:36.998593 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:36.998579 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.001063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.001044 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 22 19:33:37.001063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.001057 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 22 19:33:37.001217 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.001076 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:33:37.008886 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.008865 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:33:37.040998 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.040964 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.041172 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.041014 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.141792 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.141760 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.141968 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.141809 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.141968 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:33:37.141898 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found
Apr 22 19:33:37.142038 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:33:37.141971 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls podName:10cd2779-fa41-433d-9d78-f64350441e58 nodeName:}" failed. No retries permitted until 2026-04-22 19:33:37.641954842 +0000 UTC m=+595.836149407 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls") pod "model-chainer-766d6fc4c8-mvrn7" (UID: "10cd2779-fa41-433d-9d78-f64350441e58") : secret "model-chainer-serving-cert" not found
Apr 22 19:33:37.142415 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.142398 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.646689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.646650 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.649018 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.648992 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") pod \"model-chainer-766d6fc4c8-mvrn7\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:37.909842 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:37.909740 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:38.021435 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:38.021409 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:33:38.023505 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:33:38.023480 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10cd2779_fa41_433d_9d78_f64350441e58.slice/crio-196b43e63454b11a21a8efb593550993ae83574bc2ec83e4ded20cbe380d22ab WatchSource:0}: Error finding container 196b43e63454b11a21a8efb593550993ae83574bc2ec83e4ded20cbe380d22ab: Status 404 returned error can't find the container with id 196b43e63454b11a21a8efb593550993ae83574bc2ec83e4ded20cbe380d22ab
Apr 22 19:33:38.296206 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:38.296118 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" event={"ID":"10cd2779-fa41-433d-9d78-f64350441e58","Type":"ContainerStarted","Data":"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206"}
Apr 22 19:33:38.296206 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:38.296164 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" event={"ID":"10cd2779-fa41-433d-9d78-f64350441e58","Type":"ContainerStarted","Data":"196b43e63454b11a21a8efb593550993ae83574bc2ec83e4ded20cbe380d22ab"}
Apr 22 19:33:38.296379 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:38.296247 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:38.313827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:38.313784 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podStartSLOduration=2.313770614 podStartE2EDuration="2.313770614s" podCreationTimestamp="2026-04-22 19:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:38.311614808 +0000 UTC m=+596.505809397" watchObservedRunningTime="2026-04-22 19:33:38.313770614 +0000 UTC m=+596.507965200"
Apr 22 19:33:42.279292 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:42.279266 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:33:42.280335 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:42.280312 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:33:43.183822 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:43.183776 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 19:33:44.304085 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:44.304057 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"
Apr 22 19:33:47.161711 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.161672 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:33:47.162135 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.161906 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" containerID="cri-o://6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206" gracePeriod=30
Apr 22 19:33:47.385216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.385176 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"]
Apr 22 19:33:47.385510 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.385485 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" containerID="cri-o://72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae" gracePeriod=30
Apr 22 19:33:47.412895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.412817 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"]
Apr 22 19:33:47.416261 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.416238 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"
Apr 22 19:33:47.422291 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.422168 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"]
Apr 22 19:33:47.427198 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.427176 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" Apr 22 19:33:47.565627 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.565413 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"] Apr 22 19:33:47.568423 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:33:47.568396 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2cc0be_1178_469d_ba40_8ca28096298a.slice/crio-761e02da7e77c661f8c8aebe451bdcd9f65a4ed666da10db70b4e19c0d420884 WatchSource:0}: Error finding container 761e02da7e77c661f8c8aebe451bdcd9f65a4ed666da10db70b4e19c0d420884: Status 404 returned error can't find the container with id 761e02da7e77c661f8c8aebe451bdcd9f65a4ed666da10db70b4e19c0d420884 Apr 22 19:33:47.955712 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:47.955531 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 19:33:48.327030 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:48.326992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" event={"ID":"4f2cc0be-1178-469d-ba40-8ca28096298a","Type":"ContainerStarted","Data":"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5"} Apr 22 19:33:48.327030 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:48.327036 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" event={"ID":"4f2cc0be-1178-469d-ba40-8ca28096298a","Type":"ContainerStarted","Data":"761e02da7e77c661f8c8aebe451bdcd9f65a4ed666da10db70b4e19c0d420884"} Apr 22 19:33:48.327485 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:48.327170 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" Apr 22 19:33:48.328585 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:48.328559 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:33:48.344540 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:48.344498 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podStartSLOduration=1.344484403 podStartE2EDuration="1.344484403s" podCreationTimestamp="2026-04-22 19:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:48.343106964 +0000 UTC m=+606.537301555" watchObservedRunningTime="2026-04-22 19:33:48.344484403 +0000 UTC m=+606.538678989" Apr 22 19:33:49.303231 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:49.303189 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:33:49.329733 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:49.329683 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:33:51.821584 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:33:51.821557 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:33:51.870972 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:51.870944 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location\") pod \"ca101244-9dd1-4827-887a-d36038109281\" (UID: \"ca101244-9dd1-4827-887a-d36038109281\") " Apr 22 19:33:51.871290 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:51.871271 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca101244-9dd1-4827-887a-d36038109281" (UID: "ca101244-9dd1-4827-887a-d36038109281"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:51.972203 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:51.972124 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca101244-9dd1-4827-887a-d36038109281-kserve-provision-location\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:33:52.340170 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.340132 2564 generic.go:358] "Generic (PLEG): container finished" podID="ca101244-9dd1-4827-887a-d36038109281" containerID="72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae" exitCode=0 Apr 22 19:33:52.340315 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.340215 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" Apr 22 19:33:52.340315 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.340214 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerDied","Data":"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae"} Apr 22 19:33:52.340315 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.340258 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd" event={"ID":"ca101244-9dd1-4827-887a-d36038109281","Type":"ContainerDied","Data":"22948ec4d1f17ba2bfc3b6af5130cc76bc0e68c2a6de6fd1b09bbe052ae19c70"} Apr 22 19:33:52.340315 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.340278 2564 scope.go:117] "RemoveContainer" containerID="72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae" Apr 22 19:33:52.349560 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.349531 2564 scope.go:117] "RemoveContainer" containerID="bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350" Apr 22 19:33:52.357192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.357174 2564 scope.go:117] "RemoveContainer" containerID="72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae" Apr 22 19:33:52.357429 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:33:52.357413 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae\": container with ID starting with 72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae not found: ID does not exist" containerID="72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae" Apr 22 19:33:52.357482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.357435 2564 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae"} err="failed to get container status \"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae\": rpc error: code = NotFound desc = could not find container \"72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae\": container with ID starting with 72f9f447a61be078d13812e18852f27cea46248327029751154be5e7c2173cae not found: ID does not exist" Apr 22 19:33:52.357482 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.357448 2564 scope.go:117] "RemoveContainer" containerID="bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350" Apr 22 19:33:52.357663 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:33:52.357646 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350\": container with ID starting with bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350 not found: ID does not exist" containerID="bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350" Apr 22 19:33:52.357719 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.357669 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350"} err="failed to get container status \"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350\": rpc error: code = NotFound desc = could not find container \"bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350\": container with ID starting with bd9c40774680a0f9a01c8b8a25217a92f22c95744f49265431a1df8ff0d42350 not found: ID does not exist" Apr 22 19:33:52.368075 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.368054 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"] Apr 22 19:33:52.370492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:52.370471 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-5dfcf87bd5-vw8nd"] Apr 22 19:33:53.184228 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:53.184199 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" Apr 22 19:33:54.303171 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:54.303125 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:33:54.367795 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:54.367767 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca101244-9dd1-4827-887a-d36038109281" path="/var/lib/kubelet/pods/ca101244-9dd1-4827-887a-d36038109281/volumes" Apr 22 19:33:59.302841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:59.302798 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:33:59.303277 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:59.302922 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" Apr 22 19:33:59.329921 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:33:59.329875 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:34:04.303397 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:04.303317 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:09.303190 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:09.303150 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:09.330555 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:09.330521 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:34:11.400423 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400392 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"] Apr 22 19:34:11.400834 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400819 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="storage-initializer" Apr 22 19:34:11.400834 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400835 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="storage-initializer" Apr 22 19:34:11.400923 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400842 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca101244-9dd1-4827-887a-d36038109281" 
containerName="kserve-container" Apr 22 19:34:11.400923 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400848 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" Apr 22 19:34:11.400923 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.400914 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca101244-9dd1-4827-887a-d36038109281" containerName="kserve-container" Apr 22 19:34:11.402775 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.402755 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.405238 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.405218 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1be4a-kube-rbac-proxy-sar-config\"" Apr 22 19:34:11.405389 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.405374 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1be4a-serving-cert\"" Apr 22 19:34:11.410765 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.410746 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"] Apr 22 19:34:11.550519 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.550481 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.550729 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.550599 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.652043 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.651955 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.652043 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.652037 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.652603 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.652577 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.654266 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.654241 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls\") pod \"switch-graph-1be4a-d6b9df876-tkptz\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") " pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 
19:34:11.712745 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.712685 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:11.828046 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:11.827946 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"] Apr 22 19:34:11.830715 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:34:11.830675 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a1fe01_7206_49cb_9307_29235a9d17e8.slice/crio-29c7738bc5e21716cfb2ca2565698cb0e6eb7eac5db2d9ce258dcd4c638226cd WatchSource:0}: Error finding container 29c7738bc5e21716cfb2ca2565698cb0e6eb7eac5db2d9ce258dcd4c638226cd: Status 404 returned error can't find the container with id 29c7738bc5e21716cfb2ca2565698cb0e6eb7eac5db2d9ce258dcd4c638226cd Apr 22 19:34:12.407889 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:12.407855 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" event={"ID":"06a1fe01-7206-49cb-9307-29235a9d17e8","Type":"ContainerStarted","Data":"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"} Apr 22 19:34:12.407889 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:12.407888 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" event={"ID":"06a1fe01-7206-49cb-9307-29235a9d17e8","Type":"ContainerStarted","Data":"29c7738bc5e21716cfb2ca2565698cb0e6eb7eac5db2d9ce258dcd4c638226cd"} Apr 22 19:34:12.408331 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:12.407910 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" Apr 22 19:34:12.425018 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:12.424974 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podStartSLOduration=1.424954476 podStartE2EDuration="1.424954476s" podCreationTimestamp="2026-04-22 19:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:34:12.424462097 +0000 UTC m=+630.618656708" watchObservedRunningTime="2026-04-22 19:34:12.424954476 +0000 UTC m=+630.619149063" Apr 22 19:34:14.302893 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:14.302854 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:17.297616 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.297592 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" Apr 22 19:34:17.401942 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.401910 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle\") pod \"10cd2779-fa41-433d-9d78-f64350441e58\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " Apr 22 19:34:17.402101 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.402045 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") pod \"10cd2779-fa41-433d-9d78-f64350441e58\" (UID: \"10cd2779-fa41-433d-9d78-f64350441e58\") " Apr 22 19:34:17.402277 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.402254 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "10cd2779-fa41-433d-9d78-f64350441e58" (UID: "10cd2779-fa41-433d-9d78-f64350441e58"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:34:17.404006 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.403985 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "10cd2779-fa41-433d-9d78-f64350441e58" (UID: "10cd2779-fa41-433d-9d78-f64350441e58"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:34:17.422653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.422584 2564 generic.go:358] "Generic (PLEG): container finished" podID="10cd2779-fa41-433d-9d78-f64350441e58" containerID="6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206" exitCode=0 Apr 22 19:34:17.422653 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.422646 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" Apr 22 19:34:17.422823 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.422654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" event={"ID":"10cd2779-fa41-433d-9d78-f64350441e58","Type":"ContainerDied","Data":"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206"} Apr 22 19:34:17.422823 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.422679 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7" event={"ID":"10cd2779-fa41-433d-9d78-f64350441e58","Type":"ContainerDied","Data":"196b43e63454b11a21a8efb593550993ae83574bc2ec83e4ded20cbe380d22ab"} Apr 22 19:34:17.422823 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.422714 2564 scope.go:117] "RemoveContainer" containerID="6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206" Apr 22 19:34:17.430449 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.430428 2564 scope.go:117] "RemoveContainer" containerID="6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206" Apr 22 19:34:17.430719 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:34:17.430687 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206\": container with ID starting with 6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206 not found: ID does not exist" containerID="6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206" Apr 22 19:34:17.430771 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.430725 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206"} err="failed to get container status 
\"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206\": rpc error: code = NotFound desc = could not find container \"6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206\": container with ID starting with 6313c346376b8dd4f2fa8b5db02aa6a3d1ab2814e65b63a3cd04f41248130206 not found: ID does not exist"
Apr 22 19:34:17.443438 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.443413 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:34:17.447500 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.447482 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-766d6fc4c8-mvrn7"]
Apr 22 19:34:17.503329 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.503302 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10cd2779-fa41-433d-9d78-f64350441e58-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:34:17.503329 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:17.503327 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cd2779-fa41-433d-9d78-f64350441e58-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:34:18.367493 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:18.367449 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cd2779-fa41-433d-9d78-f64350441e58" path="/var/lib/kubelet/pods/10cd2779-fa41-433d-9d78-f64350441e58/volumes"
Apr 22 19:34:18.415612 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:18.415587 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"
Apr 22 19:34:19.329846 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:19.329801 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 19:34:29.330793 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:29.330747 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 19:34:39.330874 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:39.330837 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"
Apr 22 19:34:57.369156 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.369126 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"]
Apr 22 19:34:57.369590 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.369482 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer"
Apr 22 19:34:57.369590 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.369494 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer"
Apr 22 19:34:57.369590 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.369553 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="10cd2779-fa41-433d-9d78-f64350441e58" containerName="model-chainer"
Apr 22 19:34:57.372969 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.372954 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:57.375673 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.375646 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-22a3d-kube-rbac-proxy-sar-config\""
Apr 22 19:34:57.375806 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.375689 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-22a3d-serving-cert\""
Apr 22 19:34:57.380827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.380806 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"]
Apr 22 19:34:57.452516 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.452476 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:57.452689 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.452526 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:57.553266 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.553237 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:57.553430 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.553282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:57.553430 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:34:57.553411 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-22a3d-serving-cert: secret "sequence-graph-22a3d-serving-cert" not found
Apr 22 19:34:57.553541 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:34:57.553465 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls podName:37a05835-34d1-47b6-a049-99a8222ecc59 nodeName:}" failed. No retries permitted until 2026-04-22 19:34:58.053450895 +0000 UTC m=+676.247645461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls") pod "sequence-graph-22a3d-5498b7b956-qwg8j" (UID: "37a05835-34d1-47b6-a049-99a8222ecc59") : secret "sequence-graph-22a3d-serving-cert" not found
Apr 22 19:34:57.553850 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:57.553831 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:58.057772 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.057738 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:58.060044 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.060019 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") pod \"sequence-graph-22a3d-5498b7b956-qwg8j\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:58.283831 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.283785 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:58.401651 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.401629 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"]
Apr 22 19:34:58.403892 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:34:58.403868 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a05835_34d1_47b6_a049_99a8222ecc59.slice/crio-dee2698fa1e2b2428062bc891e5c74b0cb089822ee520b83416dd79c9fed09d3 WatchSource:0}: Error finding container dee2698fa1e2b2428062bc891e5c74b0cb089822ee520b83416dd79c9fed09d3: Status 404 returned error can't find the container with id dee2698fa1e2b2428062bc891e5c74b0cb089822ee520b83416dd79c9fed09d3
Apr 22 19:34:58.550394 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.550366 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" event={"ID":"37a05835-34d1-47b6-a049-99a8222ecc59","Type":"ContainerStarted","Data":"ed8ee80328f6a6e33b32eb22c27878b7984b6b90b3b45f85c97dc241bcef4b3f"}
Apr 22 19:34:58.550394 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.550398 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" event={"ID":"37a05835-34d1-47b6-a049-99a8222ecc59","Type":"ContainerStarted","Data":"dee2698fa1e2b2428062bc891e5c74b0cb089822ee520b83416dd79c9fed09d3"}
Apr 22 19:34:58.550616 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.550589 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:34:58.566928 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:34:58.566886 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podStartSLOduration=1.5668731 podStartE2EDuration="1.5668731s" podCreationTimestamp="2026-04-22 19:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:34:58.56556255 +0000 UTC m=+676.759757138" watchObservedRunningTime="2026-04-22 19:34:58.5668731 +0000 UTC m=+676.761067686"
Apr 22 19:35:04.559305 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:35:04.559277 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"
Apr 22 19:38:42.302556 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:38:42.302473 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:38:42.304872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:38:42.304850 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:42:26.048770 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.048741 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"]
Apr 22 19:42:26.051277 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.048954 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" containerID="cri-o://66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c" gracePeriod=30
Apr 22 19:42:26.134928 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.134891 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:42:26.135175 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.135128 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container" containerID="cri-o://27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6" gracePeriod=30
Apr 22 19:42:26.222445 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.222415 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"]
Apr 22 19:42:26.225917 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.225896 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"
Apr 22 19:42:26.235520 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.235496 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"
Apr 22 19:42:26.241118 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.241096 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"]
Apr 22 19:42:26.355356 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.355320 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"]
Apr 22 19:42:26.358277 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:42:26.358252 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf7052b_2101_44bd_9c75_b86e4b7314c1.slice/crio-d9f9c7ddb34c46dc3a6e64bf80c26120d4da4f4ddac7131bf3b2de1fef2df74b WatchSource:0}: Error finding container d9f9c7ddb34c46dc3a6e64bf80c26120d4da4f4ddac7131bf3b2de1fef2df74b: Status 404 returned error can't find the container with id d9f9c7ddb34c46dc3a6e64bf80c26120d4da4f4ddac7131bf3b2de1fef2df74b
Apr 22 19:42:26.359958 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.359943 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:42:26.873495 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.873462 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" event={"ID":"5cf7052b-2101-44bd-9c75-b86e4b7314c1","Type":"ContainerStarted","Data":"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"}
Apr 22 19:42:26.873715 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.873502 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" event={"ID":"5cf7052b-2101-44bd-9c75-b86e4b7314c1","Type":"ContainerStarted","Data":"d9f9c7ddb34c46dc3a6e64bf80c26120d4da4f4ddac7131bf3b2de1fef2df74b"}
Apr 22 19:42:26.873715 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.873610 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"
Apr 22 19:42:26.874748 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.874724 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:42:26.888802 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:26.888764 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podStartSLOduration=0.888749901 podStartE2EDuration="888.749901ms" podCreationTimestamp="2026-04-22 19:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:26.887771392 +0000 UTC m=+1125.081965986" watchObservedRunningTime="2026-04-22 19:42:26.888749901 +0000 UTC m=+1125.082944484"
Apr 22 19:42:27.876836 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:27.876797 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:42:28.413929 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:28.413895 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:29.370492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.370466 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"
Apr 22 19:42:29.883524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.883491 2564 generic.go:358] "Generic (PLEG): container finished" podID="7a370686-f446-4ece-b109-3f444ec79f60" containerID="27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6" exitCode=0
Apr 22 19:42:29.883725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.883575 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"
Apr 22 19:42:29.883725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.883581 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" event={"ID":"7a370686-f446-4ece-b109-3f444ec79f60","Type":"ContainerDied","Data":"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"}
Apr 22 19:42:29.883725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.883631 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6" event={"ID":"7a370686-f446-4ece-b109-3f444ec79f60","Type":"ContainerDied","Data":"7d2f268202d2ab07dce3de6eb97369776bc227bb92ad9c281cde3d47d66b36ed"}
Apr 22 19:42:29.883725 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.883647 2564 scope.go:117] "RemoveContainer" containerID="27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"
Apr 22 19:42:29.891306 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.891290 2564 scope.go:117] "RemoveContainer" containerID="27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"
Apr 22 19:42:29.891617 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:42:29.891584 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6\": container with ID starting with 27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6 not found: ID does not exist" containerID="27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"
Apr 22 19:42:29.891731 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.891631 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6"} err="failed to get container status \"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6\": rpc error: code = NotFound desc = could not find container \"27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6\": container with ID starting with 27da5eb7676bd5570e9d5c436283423b4d29590b605b1f1c93c4ed2dac83c2a6 not found: ID does not exist"
Apr 22 19:42:29.904742 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.904718 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:42:29.910642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:29.910608 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1be4a-predictor-789bcb6475-x8cl6"]
Apr 22 19:42:30.367309 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:30.367272 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a370686-f446-4ece-b109-3f444ec79f60" path="/var/lib/kubelet/pods/7a370686-f446-4ece-b109-3f444ec79f60/volumes"
Apr 22 19:42:33.414452 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:33.414415 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:37.876929 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:37.876887 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:42:38.414204 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:38.414164 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:38.414404 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:38.414277 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"
Apr 22 19:42:43.414719 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:43.414670 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:47.876978 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:47.876939 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:42:48.414527 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:48.414490 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:53.414812 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:53.414771 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:42:56.690172 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.690148 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"
Apr 22 19:42:56.810429 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.810400 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls\") pod \"06a1fe01-7206-49cb-9307-29235a9d17e8\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") "
Apr 22 19:42:56.810658 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.810464 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle\") pod \"06a1fe01-7206-49cb-9307-29235a9d17e8\" (UID: \"06a1fe01-7206-49cb-9307-29235a9d17e8\") "
Apr 22 19:42:56.810825 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.810791 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "06a1fe01-7206-49cb-9307-29235a9d17e8" (UID: "06a1fe01-7206-49cb-9307-29235a9d17e8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:42:56.812362 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.812336 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "06a1fe01-7206-49cb-9307-29235a9d17e8" (UID: "06a1fe01-7206-49cb-9307-29235a9d17e8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:42:56.911709 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.911666 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a1fe01-7206-49cb-9307-29235a9d17e8-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:42:56.911709 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.911709 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a1fe01-7206-49cb-9307-29235a9d17e8-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:42:56.965088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.965055 2564 generic.go:358] "Generic (PLEG): container finished" podID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerID="66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c" exitCode=0
Apr 22 19:42:56.965235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.965126 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"
Apr 22 19:42:56.965235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.965136 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" event={"ID":"06a1fe01-7206-49cb-9307-29235a9d17e8","Type":"ContainerDied","Data":"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"}
Apr 22 19:42:56.965235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.965172 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz" event={"ID":"06a1fe01-7206-49cb-9307-29235a9d17e8","Type":"ContainerDied","Data":"29c7738bc5e21716cfb2ca2565698cb0e6eb7eac5db2d9ce258dcd4c638226cd"}
Apr 22 19:42:56.965235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.965186 2564 scope.go:117] "RemoveContainer" containerID="66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"
Apr 22 19:42:56.978916 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.978895 2564 scope.go:117] "RemoveContainer" containerID="66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"
Apr 22 19:42:56.979167 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:42:56.979148 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c\": container with ID starting with 66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c not found: ID does not exist" containerID="66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"
Apr 22 19:42:56.979230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.979180 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c"} err="failed to get container status \"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c\": rpc error: code = NotFound desc = could not find container \"66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c\": container with ID starting with 66ac27d0b38b3c9c62a4d2fc66544852262882c5b65aee166922fd98b13a710c not found: ID does not exist"
Apr 22 19:42:56.991127 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.991101 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"]
Apr 22 19:42:56.995932 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:56.995907 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1be4a-d6b9df876-tkptz"]
Apr 22 19:42:57.877591 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:57.877545 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:42:58.367136 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:42:58.367104 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" path="/var/lib/kubelet/pods/06a1fe01-7206-49cb-9307-29235a9d17e8/volumes"
Apr 22 19:43:07.877756 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:07.877660 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 19:43:12.148868 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.148833 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"]
Apr 22 19:43:12.149213 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.149058 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" containerID="cri-o://ed8ee80328f6a6e33b32eb22c27878b7984b6b90b3b45f85c97dc241bcef4b3f" gracePeriod=30
Apr 22 19:43:12.238090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.238060 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"]
Apr 22 19:43:12.238366 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.238320 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" containerID="cri-o://3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5" gracePeriod=30
Apr 22 19:43:12.269477 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269446 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:43:12.269852 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269837 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a"
Apr 22 19:43:12.269903 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269854 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a"
Apr 22 19:43:12.269903 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269877 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container"
Apr 22 19:43:12.269903 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269883 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container"
Apr 22 19:43:12.269998 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269947 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a370686-f446-4ece-b109-3f444ec79f60" containerName="kserve-container"
Apr 22 19:43:12.269998 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.269958 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="06a1fe01-7206-49cb-9307-29235a9d17e8" containerName="switch-graph-1be4a"
Apr 22 19:43:12.272887 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.272873 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:43:12.279136 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.279111 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:43:12.282534 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.282517 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:43:12.404075 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:12.404053 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:43:12.407209 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:43:12.407180 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678037b8_563b_4586_b25c_1d27ecdbfcaa.slice/crio-76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa WatchSource:0}: Error finding container 76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa: Status 404 returned error can't find the container with id 76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa
Apr 22 19:43:13.016026 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:13.015993 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" event={"ID":"678037b8-563b-4586-b25c-1d27ecdbfcaa","Type":"ContainerStarted","Data":"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"}
Apr 22 19:43:13.016026 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:13.016028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" event={"ID":"678037b8-563b-4586-b25c-1d27ecdbfcaa","Type":"ContainerStarted","Data":"76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa"}
Apr 22 19:43:13.016233 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:13.016213 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:43:13.017261 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:13.017239 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 19:43:13.031773 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:13.031736 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podStartSLOduration=1.031723642 podStartE2EDuration="1.031723642s" podCreationTimestamp="2026-04-22 19:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:13.029962531 +0000 UTC m=+1171.224157128" watchObservedRunningTime="2026-04-22 19:43:13.031723642 +0000 UTC m=+1171.225918227"
Apr 22 19:43:14.019826 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:14.019787 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 19:43:14.557409 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:14.557369 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:43:15.778642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:15.778621 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"
Apr 22 19:43:16.025630 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.025544 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerID="3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5" exitCode=0
Apr 22 19:43:16.025630 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.025605 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"
Apr 22 19:43:16.025863 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.025629 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" event={"ID":"4f2cc0be-1178-469d-ba40-8ca28096298a","Type":"ContainerDied","Data":"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5"}
Apr 22 19:43:16.025863 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.025670 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd" event={"ID":"4f2cc0be-1178-469d-ba40-8ca28096298a","Type":"ContainerDied","Data":"761e02da7e77c661f8c8aebe451bdcd9f65a4ed666da10db70b4e19c0d420884"}
Apr 22 19:43:16.025863 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.025689 2564 scope.go:117] "RemoveContainer" containerID="3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5"
Apr 22 19:43:16.033240 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.033227 2564 scope.go:117] "RemoveContainer" containerID="3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5"
Apr 22 19:43:16.033446 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:16.033429 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5\": container with ID starting with 3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5 not found: ID does not exist" containerID="3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5" Apr 22 19:43:16.033492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.033454 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5"} err="failed to get container status \"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5\": rpc error: code = NotFound desc = could not find container \"3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5\": container with ID starting with 3fd26a27d5898520d317606b02cf741e106158aa5852f1461829e894c8cd30a5 not found: ID does not exist" Apr 22 19:43:16.046269 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.046247 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"] Apr 22 19:43:16.053841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.053811 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-22a3d-predictor-86595f965c-2vstd"] Apr 22 19:43:16.367660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:16.367623 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" path="/var/lib/kubelet/pods/4f2cc0be-1178-469d-ba40-8ca28096298a/volumes" Apr 22 19:43:17.877875 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:17.877843 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" Apr 22 19:43:19.557883 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:19.557840 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:43:24.020121 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:24.020080 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 19:43:24.557933 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:24.557893 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:43:24.558113 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:24.558003 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" Apr 22 19:43:29.557871 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:29.557830 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:43:34.020183 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:34.020138 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 19:43:34.558380 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:34.558336 2564 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:43:36.313910 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.313881 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"] Apr 22 19:43:36.314264 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.314236 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" Apr 22 19:43:36.314264 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.314247 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" Apr 22 19:43:36.314332 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.314299 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f2cc0be-1178-469d-ba40-8ca28096298a" containerName="kserve-container" Apr 22 19:43:36.318869 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.318849 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:36.321971 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.321952 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0213e-kube-rbac-proxy-sar-config\"" Apr 22 19:43:36.322086 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.322071 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0213e-serving-cert\"" Apr 22 19:43:36.333760 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.333733 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"] Apr 22 19:43:36.448848 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.448799 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:36.449025 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.448854 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:36.549530 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.549497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: 
\"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:36.549530 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.549536 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:36.549778 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:36.549630 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-0213e-serving-cert: secret "ensemble-graph-0213e-serving-cert" not found Apr 22 19:43:36.549778 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:36.549717 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls podName:9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61 nodeName:}" failed. No retries permitted until 2026-04-22 19:43:37.049681385 +0000 UTC m=+1195.243875954 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls") pod "ensemble-graph-0213e-9d9695dfb-m2mzx" (UID: "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61") : secret "ensemble-graph-0213e-serving-cert" not found Apr 22 19:43:36.550112 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:36.550096 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:37.053196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:37.053156 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:37.053391 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:37.053317 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-0213e-serving-cert: secret "ensemble-graph-0213e-serving-cert" not found Apr 22 19:43:37.053461 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:37.053393 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls podName:9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61 nodeName:}" failed. No retries permitted until 2026-04-22 19:43:38.053374383 +0000 UTC m=+1196.247568951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls") pod "ensemble-graph-0213e-9d9695dfb-m2mzx" (UID: "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61") : secret "ensemble-graph-0213e-serving-cert" not found Apr 22 19:43:38.061522 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:38.061480 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:38.063816 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:38.063782 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"ensemble-graph-0213e-9d9695dfb-m2mzx\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") " pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:38.134827 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:38.134796 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:38.248764 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:38.248742 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"] Apr 22 19:43:38.250855 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:43:38.250829 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2a5a4c_c2b9_4d78_ba2a_0c5cebeeed61.slice/crio-905166eb2fb642d61a04cc60dc36ad0820f616398b5684b5da67d3d430a54c5b WatchSource:0}: Error finding container 905166eb2fb642d61a04cc60dc36ad0820f616398b5684b5da67d3d430a54c5b: Status 404 returned error can't find the container with id 905166eb2fb642d61a04cc60dc36ad0820f616398b5684b5da67d3d430a54c5b Apr 22 19:43:39.098644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:39.098611 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" event={"ID":"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61","Type":"ContainerStarted","Data":"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"} Apr 22 19:43:39.098644 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:39.098646 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" event={"ID":"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61","Type":"ContainerStarted","Data":"905166eb2fb642d61a04cc60dc36ad0820f616398b5684b5da67d3d430a54c5b"} Apr 22 19:43:39.099063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:39.098742 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:39.114824 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:39.114770 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" 
podStartSLOduration=3.114752257 podStartE2EDuration="3.114752257s" podCreationTimestamp="2026-04-22 19:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:39.114627242 +0000 UTC m=+1197.308821829" watchObservedRunningTime="2026-04-22 19:43:39.114752257 +0000 UTC m=+1197.308946845" Apr 22 19:43:39.558094 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:39.558057 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:43:42.288099 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.288078 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" Apr 22 19:43:42.332797 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.332768 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 19:43:42.335454 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.335436 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 19:43:42.356733 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.356709 2564 scope.go:117] "RemoveContainer" containerID="ed8ee80328f6a6e33b32eb22c27878b7984b6b90b3b45f85c97dc241bcef4b3f" Apr 22 19:43:42.401384 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.401359 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle\") pod 
\"37a05835-34d1-47b6-a049-99a8222ecc59\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " Apr 22 19:43:42.401488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.401471 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") pod \"37a05835-34d1-47b6-a049-99a8222ecc59\" (UID: \"37a05835-34d1-47b6-a049-99a8222ecc59\") " Apr 22 19:43:42.401733 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.401707 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "37a05835-34d1-47b6-a049-99a8222ecc59" (UID: "37a05835-34d1-47b6-a049-99a8222ecc59"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:43:42.403316 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.403273 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "37a05835-34d1-47b6-a049-99a8222ecc59" (UID: "37a05835-34d1-47b6-a049-99a8222ecc59"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:43:42.502883 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.502849 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/37a05835-34d1-47b6-a049-99a8222ecc59-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:43:42.502883 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:42.502882 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a05835-34d1-47b6-a049-99a8222ecc59-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:43:43.111433 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:43.111403 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" event={"ID":"37a05835-34d1-47b6-a049-99a8222ecc59","Type":"ContainerDied","Data":"ed8ee80328f6a6e33b32eb22c27878b7984b6b90b3b45f85c97dc241bcef4b3f"} Apr 22 19:43:43.111433 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:43.111431 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" Apr 22 19:43:43.111642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:43.111437 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j" event={"ID":"37a05835-34d1-47b6-a049-99a8222ecc59","Type":"ContainerDied","Data":"dee2698fa1e2b2428062bc891e5c74b0cb089822ee520b83416dd79c9fed09d3"} Apr 22 19:43:43.139404 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:43.139375 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"] Apr 22 19:43:43.142834 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:43.142815 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-22a3d-5498b7b956-qwg8j"] Apr 22 19:43:44.020235 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:44.020194 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 19:43:44.367716 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:44.367665 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" path="/var/lib/kubelet/pods/37a05835-34d1-47b6-a049-99a8222ecc59/volumes" Apr 22 19:43:45.107614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:45.107581 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" Apr 22 19:43:46.381795 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.381760 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"] Apr 22 19:43:46.382179 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.381953 2564 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" containerID="cri-o://edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae" gracePeriod=30 Apr 22 19:43:46.477897 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.477862 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"] Apr 22 19:43:46.478179 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.478152 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" containerID="cri-o://ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7" gracePeriod=30 Apr 22 19:43:46.529843 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.529814 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"] Apr 22 19:43:46.533475 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.531159 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" Apr 22 19:43:46.533475 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.531195 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" Apr 22 19:43:46.533475 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.531374 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="37a05835-34d1-47b6-a049-99a8222ecc59" containerName="sequence-graph-22a3d" Apr 22 19:43:46.537062 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.537035 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" Apr 22 19:43:46.542272 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.542245 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"] Apr 22 19:43:46.548088 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.548070 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" Apr 22 19:43:46.666333 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:46.666311 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"] Apr 22 19:43:46.668243 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:43:46.668218 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod373bc30e_6149_48ca_a2be_a953653225a9.slice/crio-eea335a427c0b43be6505c0ba0901db666a9579616c0dd197121b3b2f9a43caa WatchSource:0}: Error finding container eea335a427c0b43be6505c0ba0901db666a9579616c0dd197121b3b2f9a43caa: Status 404 returned error can't find the container with id eea335a427c0b43be6505c0ba0901db666a9579616c0dd197121b3b2f9a43caa Apr 22 19:43:47.125266 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.125228 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" event={"ID":"373bc30e-6149-48ca-a2be-a953653225a9","Type":"ContainerStarted","Data":"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"} Apr 22 19:43:47.125266 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.125270 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" 
event={"ID":"373bc30e-6149-48ca-a2be-a953653225a9","Type":"ContainerStarted","Data":"eea335a427c0b43be6505c0ba0901db666a9579616c0dd197121b3b2f9a43caa"} Apr 22 19:43:47.125519 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.125433 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" Apr 22 19:43:47.126770 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.126746 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 22 19:43:47.142059 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.142010 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podStartSLOduration=1.141996751 podStartE2EDuration="1.141996751s" podCreationTimestamp="2026-04-22 19:43:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:47.139958751 +0000 UTC m=+1205.334153337" watchObservedRunningTime="2026-04-22 19:43:47.141996751 +0000 UTC m=+1205.336191334" Apr 22 19:43:47.877144 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:47.877108 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 22 19:43:48.128603 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:48.128517 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" 
podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:43:49.719884 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:49.719860 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"
Apr 22 19:43:50.106334 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.106298 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:43:50.134975 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.134946 2564 generic.go:358] "Generic (PLEG): container finished" podID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerID="ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7" exitCode=0
Apr 22 19:43:50.135149 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.134983 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" event={"ID":"5cf7052b-2101-44bd-9c75-b86e4b7314c1","Type":"ContainerDied","Data":"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"}
Apr 22 19:43:50.135149 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.135007 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m" event={"ID":"5cf7052b-2101-44bd-9c75-b86e4b7314c1","Type":"ContainerDied","Data":"d9f9c7ddb34c46dc3a6e64bf80c26120d4da4f4ddac7131bf3b2de1fef2df74b"}
Apr 22 19:43:50.135149 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.135005 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"
Apr 22 19:43:50.135149 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.135086 2564 scope.go:117] "RemoveContainer" containerID="ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"
Apr 22 19:43:50.143160 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.143143 2564 scope.go:117] "RemoveContainer" containerID="ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"
Apr 22 19:43:50.143422 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:43:50.143402 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7\": container with ID starting with ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7 not found: ID does not exist" containerID="ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"
Apr 22 19:43:50.143468 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.143432 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7"} err="failed to get container status \"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7\": rpc error: code = NotFound desc = could not find container \"ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7\": container with ID starting with ef6fa12b1fd66aac3fcc9aba1eeda44181cd288f7e6c7e20e32b0d9a2d0632b7 not found: ID does not exist"
Apr 22 19:43:50.154617 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.154593 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"]
Apr 22 19:43:50.158147 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.158125 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0213e-predictor-6b5fb55459-7xg9m"]
Apr 22 19:43:50.367996 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:50.367920 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" path="/var/lib/kubelet/pods/5cf7052b-2101-44bd-9c75-b86e4b7314c1/volumes"
Apr 22 19:43:54.020598 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:54.020548 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 19:43:55.106079 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:55.106038 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:43:58.129487 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:43:58.129440 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:44:00.106460 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:00.106415 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:00.106907 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:00.106527 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"
Apr 22 19:44:04.021030 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:04.020996 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:44:05.106202 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:05.106165 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:08.129488 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:08.129439 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:44:10.105628 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:10.105588 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:15.106001 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:15.105963 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:16.526434 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.526407 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"
Apr 22 19:44:16.585766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.585733 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle\") pod \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") "
Apr 22 19:44:16.585928 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.585792 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") pod \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\" (UID: \"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61\") "
Apr 22 19:44:16.586113 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.586090 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" (UID: "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:44:16.587742 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.587689 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" (UID: "9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:44:16.687498 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.687407 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:44:16.687498 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:16.687445 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:44:17.219996 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.219957 2564 generic.go:358] "Generic (PLEG): container finished" podID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerID="edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae" exitCode=0
Apr 22 19:44:17.220244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.220002 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" event={"ID":"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61","Type":"ContainerDied","Data":"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"}
Apr 22 19:44:17.220244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.220027 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"
Apr 22 19:44:17.220244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.220042 2564 scope.go:117] "RemoveContainer" containerID="edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"
Apr 22 19:44:17.220244 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.220030 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx" event={"ID":"9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61","Type":"ContainerDied","Data":"905166eb2fb642d61a04cc60dc36ad0820f616398b5684b5da67d3d430a54c5b"}
Apr 22 19:44:17.227884 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.227865 2564 scope.go:117] "RemoveContainer" containerID="edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"
Apr 22 19:44:17.228129 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:44:17.228114 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae\": container with ID starting with edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae not found: ID does not exist" containerID="edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"
Apr 22 19:44:17.228192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.228137 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae"} err="failed to get container status \"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae\": rpc error: code = NotFound desc = could not find container \"edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae\": container with ID starting with edb315bdf79ded5cf0b999e7bb16f76b98225c68b91e769360fb4c06657822ae not found: ID does not exist"
Apr 22 19:44:17.241479 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.241454 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"]
Apr 22 19:44:17.246791 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:17.246771 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0213e-9d9695dfb-m2mzx"]
Apr 22 19:44:18.128829 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:18.128790 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:44:18.368078 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:18.368045 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" path="/var/lib/kubelet/pods/9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61/volumes"
Apr 22 19:44:22.361672 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.361639 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"]
Apr 22 19:44:22.362063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.361997 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e"
Apr 22 19:44:22.362063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.362009 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e"
Apr 22 19:44:22.362063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.362028 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container"
Apr 22 19:44:22.362063 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.362033 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container"
Apr 22 19:44:22.362200 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.362082 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cf7052b-2101-44bd-9c75-b86e4b7314c1" containerName="kserve-container"
Apr 22 19:44:22.362200 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.362093 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d2a5a4c-c2b9-4d78-ba2a-0c5cebeeed61" containerName="ensemble-graph-0213e"
Apr 22 19:44:22.366590 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.366568 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.369782 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.369759 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3becf-kube-rbac-proxy-sar-config\""
Apr 22 19:44:22.369896 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.369766 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:44:22.370762 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.370747 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3becf-serving-cert\""
Apr 22 19:44:22.382079 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.382059 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"]
Apr 22 19:44:22.437319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.437286 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.437484 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.437333 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.537870 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.537843 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.538036 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.537904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.538542 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.538518 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.540573 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.540555 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls\") pod \"sequence-graph-3becf-689b8c8984-dxw8k\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.677076 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.676984 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:22.792039 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:22.792012 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"]
Apr 22 19:44:22.794308 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:44:22.794272 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a WatchSource:0}: Error finding container 671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a: Status 404 returned error can't find the container with id 671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a
Apr 22 19:44:23.240982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:23.240948 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" event={"ID":"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd","Type":"ContainerStarted","Data":"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442"}
Apr 22 19:44:23.240982 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:23.240985 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" event={"ID":"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd","Type":"ContainerStarted","Data":"671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a"}
Apr 22 19:44:23.241211 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:23.241025 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:23.258997 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:23.258949 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podStartSLOduration=1.258934305 podStartE2EDuration="1.258934305s" podCreationTimestamp="2026-04-22 19:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:23.257862353 +0000 UTC m=+1241.452056941" watchObservedRunningTime="2026-04-22 19:44:23.258934305 +0000 UTC m=+1241.453128934"
Apr 22 19:44:28.129046 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:28.129003 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:44:29.251146 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:29.251117 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:32.391417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.391346 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"]
Apr 22 19:44:32.391857 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.391572 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" containerID="cri-o://38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442" gracePeriod=30
Apr 22 19:44:32.490799 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.490768 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:44:32.491003 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.490983 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" containerID="cri-o://54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529" gracePeriod=30
Apr 22 19:44:32.509362 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.509332 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:44:32.512772 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.512755 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"
Apr 22 19:44:32.526570 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.526544 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"
Apr 22 19:44:32.528940 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.528917 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:44:32.658207 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:32.658076 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:44:32.660752 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:44:32.660725 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57444f4_bd7f_44c3_b762_9278b7dc2927.slice/crio-ddac53f6fe285e257b5be32c288614f5853008f63dd3cba05ca13bb64f5ff3f0 WatchSource:0}: Error finding container ddac53f6fe285e257b5be32c288614f5853008f63dd3cba05ca13bb64f5ff3f0: Status 404 returned error can't find the container with id ddac53f6fe285e257b5be32c288614f5853008f63dd3cba05ca13bb64f5ff3f0
Apr 22 19:44:33.269841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:33.269805 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" event={"ID":"f57444f4-bd7f-44c3-b762-9278b7dc2927","Type":"ContainerStarted","Data":"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"}
Apr 22 19:44:33.269841 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:33.269843 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" event={"ID":"f57444f4-bd7f-44c3-b762-9278b7dc2927","Type":"ContainerStarted","Data":"ddac53f6fe285e257b5be32c288614f5853008f63dd3cba05ca13bb64f5ff3f0"}
Apr 22 19:44:33.270124 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:33.270104 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"
Apr 22 19:44:33.271332 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:33.271307 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 22 19:44:33.289897 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:33.289845 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podStartSLOduration=1.289830031 podStartE2EDuration="1.289830031s" podCreationTimestamp="2026-04-22 19:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:33.287047519 +0000 UTC m=+1251.481242107" watchObservedRunningTime="2026-04-22 19:44:33.289830031 +0000 UTC m=+1251.484024620"
Apr 22 19:44:34.020472 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:34.020425 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 19:44:34.248409 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:34.248360 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:34.273469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:34.273383 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 22 19:44:35.526309 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:35.526291 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:44:36.283642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.283556 2564 generic.go:358] "Generic (PLEG): container finished" podID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerID="54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529" exitCode=0
Apr 22 19:44:36.283642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.283616 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"
Apr 22 19:44:36.283642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.283625 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" event={"ID":"678037b8-563b-4586-b25c-1d27ecdbfcaa","Type":"ContainerDied","Data":"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"}
Apr 22 19:44:36.283868 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.283661 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h" event={"ID":"678037b8-563b-4586-b25c-1d27ecdbfcaa","Type":"ContainerDied","Data":"76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa"}
Apr 22 19:44:36.283868 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.283681 2564 scope.go:117] "RemoveContainer" containerID="54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"
Apr 22 19:44:36.291750 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.291509 2564 scope.go:117] "RemoveContainer" containerID="54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"
Apr 22 19:44:36.291962 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:44:36.291938 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529\": container with ID starting with 54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529 not found: ID does not exist" containerID="54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"
Apr 22 19:44:36.292058 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.292031 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529"} err="failed to get container status \"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529\": rpc error: code = NotFound desc = could not find container \"54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529\": container with ID starting with 54515dfd5dbfa049442a77c79266063c5e5f4cb6d604b03f84c6e8aa35c3b529 not found: ID does not exist"
Apr 22 19:44:36.305504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.305478 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:44:36.307894 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:44:36.307873 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678037b8_563b_4586_b25c_1d27ecdbfcaa.slice/crio-76bfb352ff34aa21da794c931310ad1f8580eee63e8443a6b9a10d4ecf69edaa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678037b8_563b_4586_b25c_1d27ecdbfcaa.slice\": RecentStats: unable to find data in memory cache]"
Apr 22 19:44:36.309156 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.309139 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3becf-predictor-5fd4d7b59-qf97h"]
Apr 22 19:44:36.367423 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:36.367395 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" path="/var/lib/kubelet/pods/678037b8-563b-4586-b25c-1d27ecdbfcaa/volumes"
Apr 22 19:44:38.130314 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:38.130282 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"
Apr 22 19:44:39.248282 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:39.248242 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:44.248500 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:44.248464 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:44.248913 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:44.248560 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"
Apr 22 19:44:44.274216 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:44.274178 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 22 19:44:49.248822 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:49.248785 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:54.248191 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:54.248153 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:44:54.273791 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:54.273750 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 22 19:44:56.619605 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.619574 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"]
Apr 22 19:44:56.620018 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.619945 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container"
Apr 22 19:44:56.620018 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.619958 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container"
Apr 22 19:44:56.620018 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.620019 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="678037b8-563b-4586-b25c-1d27ecdbfcaa" containerName="kserve-container"
Apr 22 19:44:56.624091 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.624074 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:44:56.626766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.626735 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-62743-kube-rbac-proxy-sar-config\""
Apr 22 19:44:56.626766 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.626738 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-62743-serving-cert\""
Apr 22 19:44:56.630248 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.630219 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"]
Apr 22 19:44:56.736206 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.736173 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:44:56.736382 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.736222 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:44:56.837021 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.836984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:56.837164 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.837064 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:56.837584 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.837551 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:56.839338 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.839320 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls\") pod \"ensemble-graph-62743-7bf7944f9c-pj267\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") " pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:56.934993 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:56.934889 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:57.052553 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:57.052522 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"] Apr 22 19:44:57.351492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:57.351455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" event={"ID":"055d9073-bc43-43f7-8fc6-d3c781cf6c85","Type":"ContainerStarted","Data":"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"} Apr 22 19:44:57.351492 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:57.351494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" event={"ID":"055d9073-bc43-43f7-8fc6-d3c781cf6c85","Type":"ContainerStarted","Data":"b97c249e4d3c777e00ba323e9d060875b04d2b7663b315b9c056abf239ab8b71"} Apr 22 19:44:57.351740 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:57.351660 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:44:57.369164 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:57.369118 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podStartSLOduration=1.369103967 podStartE2EDuration="1.369103967s" podCreationTimestamp="2026-04-22 19:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:57.367070622 +0000 UTC m=+1275.561265222" watchObservedRunningTime="2026-04-22 19:44:57.369103967 +0000 UTC m=+1275.563298554" Apr 22 19:44:59.248169 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:44:59.248132 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:45:02.412548 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:02.412510 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:45:02.412902 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:02.412569 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-conmon-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:45:02.412902 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:02.412508 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-conmon-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:45:02.412902 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:02.412734 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-conmon-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:45:02.412902 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:02.412736 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-conmon-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d649f08_d7aa_48fb_b43f_e4275ccbb4cd.slice/crio-38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:45:02.534564 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.534538 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" Apr 22 19:45:02.582886 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.582859 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls\") pod \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " Apr 22 19:45:02.583047 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.582901 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle\") pod \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\" (UID: \"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd\") " Apr 22 19:45:02.583283 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.583259 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" (UID: "4d649f08-d7aa-48fb-b43f-e4275ccbb4cd"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:45:02.584843 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.584824 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" (UID: "4d649f08-d7aa-48fb-b43f-e4275ccbb4cd"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:45:02.683567 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.683462 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:45:02.683567 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:02.683513 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:45:03.361334 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.361302 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" Apr 22 19:45:03.371089 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.371060 2564 generic.go:358] "Generic (PLEG): container finished" podID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerID="38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442" exitCode=0 Apr 22 19:45:03.371233 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.371105 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" event={"ID":"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd","Type":"ContainerDied","Data":"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442"} Apr 22 19:45:03.371233 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.371111 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" Apr 22 19:45:03.371233 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.371124 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k" event={"ID":"4d649f08-d7aa-48fb-b43f-e4275ccbb4cd","Type":"ContainerDied","Data":"671c5d7883fa5e6e14c94ef4a2bb72d4245e2324d1ceb57acfb8a560b118d32a"} Apr 22 19:45:03.371233 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.371141 2564 scope.go:117] "RemoveContainer" containerID="38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442" Apr 22 19:45:03.380327 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.380310 2564 scope.go:117] "RemoveContainer" containerID="38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442" Apr 22 19:45:03.380620 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:45:03.380592 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442\": container with ID starting with 38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442 not found: ID does not exist" containerID="38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442" Apr 22 19:45:03.380686 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.380628 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442"} err="failed to get container status \"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442\": rpc error: code = NotFound desc = could not find container \"38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442\": container with ID starting with 38cb6c81cf32a00fc8742733f6effe78ec4193687b85e32aef843c0320e26442 not found: ID does not exist" Apr 22 19:45:03.400877 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:45:03.400853 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"] Apr 22 19:45:03.405298 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:03.405277 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3becf-689b8c8984-dxw8k"] Apr 22 19:45:04.274473 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:04.274432 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 19:45:04.368540 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:04.368509 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" path="/var/lib/kubelet/pods/4d649f08-d7aa-48fb-b43f-e4275ccbb4cd/volumes" Apr 22 19:45:14.274319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:14.274267 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 19:45:24.275406 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:24.275370 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" Apr 22 19:45:42.606651 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.606608 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"] Apr 22 19:45:42.607195 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.607181 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" 
containerName="sequence-graph-3becf" Apr 22 19:45:42.607241 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.607197 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" Apr 22 19:45:42.607279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.607261 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d649f08-d7aa-48fb-b43f-e4275ccbb4cd" containerName="sequence-graph-3becf" Apr 22 19:45:42.610498 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.610483 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.612896 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.612871 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7d5c8-kube-rbac-proxy-sar-config\"" Apr 22 19:45:42.613067 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.613049 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7d5c8-serving-cert\"" Apr 22 19:45:42.616650 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.616630 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"] Apr 22 19:45:42.729426 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.729345 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.729580 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.729480 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.830173 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.830139 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.830370 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.830197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.830821 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.830799 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.832466 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.832449 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls\") pod \"sequence-graph-7d5c8-5fbffdc9c5-g5xfd\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") " 
pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:42.921270 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:42.921235 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:43.035308 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:43.035284 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"] Apr 22 19:45:43.037918 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:45:43.037894 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbc2835a_9e3e_4899_b3cc_1667ea5c7b17.slice/crio-6eebbeb120d6dd0d36272ce7d3ee23b8dc26c60bc7543d40ead91dd6f5e86dd4 WatchSource:0}: Error finding container 6eebbeb120d6dd0d36272ce7d3ee23b8dc26c60bc7543d40ead91dd6f5e86dd4: Status 404 returned error can't find the container with id 6eebbeb120d6dd0d36272ce7d3ee23b8dc26c60bc7543d40ead91dd6f5e86dd4 Apr 22 19:45:43.488872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:43.488841 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" event={"ID":"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17","Type":"ContainerStarted","Data":"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"} Apr 22 19:45:43.488872 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:43.488877 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" event={"ID":"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17","Type":"ContainerStarted","Data":"6eebbeb120d6dd0d36272ce7d3ee23b8dc26c60bc7543d40ead91dd6f5e86dd4"} Apr 22 19:45:43.489089 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:43.488972 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:45:43.504426 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:43.504362 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podStartSLOduration=1.504347699 podStartE2EDuration="1.504347699s" podCreationTimestamp="2026-04-22 19:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:45:43.503181882 +0000 UTC m=+1321.697376468" watchObservedRunningTime="2026-04-22 19:45:43.504347699 +0000 UTC m=+1321.698542284" Apr 22 19:45:49.498150 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:45:49.498122 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" Apr 22 19:48:42.363464 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:48:42.363426 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 19:48:42.366192 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:48:42.366170 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 19:53:11.342681 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.342651 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"] Apr 22 19:53:11.345083 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.342928 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" containerID="cri-o://3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932" gracePeriod=30 Apr 22 19:53:11.429090 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.429054 2564 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"] Apr 22 19:53:11.429317 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.429285 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container" containerID="cri-o://5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677" gracePeriod=30 Apr 22 19:53:11.503582 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.503553 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:53:11.507239 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.507219 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" Apr 22 19:53:11.516823 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.516803 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" Apr 22 19:53:11.523409 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.523383 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:53:11.634061 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.634033 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:53:11.636837 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:53:11.636812 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3fc4e2_e666_4f3e_aa46_b53556f14131.slice/crio-48df24a371c9fbd00099b37376e7ec14ea0b6e273b31dd7d6b0b8142b1495272 WatchSource:0}: Error finding container 48df24a371c9fbd00099b37376e7ec14ea0b6e273b31dd7d6b0b8142b1495272: Status 404 returned error can't find the container with id 48df24a371c9fbd00099b37376e7ec14ea0b6e273b31dd7d6b0b8142b1495272 Apr 22 19:53:11.638953 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.638937 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:53:11.832727 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.832678 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" event={"ID":"ab3fc4e2-e666-4f3e-aa46-b53556f14131","Type":"ContainerStarted","Data":"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9"} Apr 22 19:53:11.832727 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.832730 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" event={"ID":"ab3fc4e2-e666-4f3e-aa46-b53556f14131","Type":"ContainerStarted","Data":"48df24a371c9fbd00099b37376e7ec14ea0b6e273b31dd7d6b0b8142b1495272"} Apr 
22 19:53:11.832923 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.832748 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" Apr 22 19:53:11.834191 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.834164 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 19:53:11.848236 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:11.848196 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podStartSLOduration=0.848182838 podStartE2EDuration="848.182838ms" podCreationTimestamp="2026-04-22 19:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:11.846873902 +0000 UTC m=+1770.041068490" watchObservedRunningTime="2026-04-22 19:53:11.848182838 +0000 UTC m=+1770.042377425" Apr 22 19:53:12.836319 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:12.836284 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 19:53:13.358945 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:13.358912 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:53:14.567554 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.567532 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"
Apr 22 19:53:14.845448 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.845419 2564 generic.go:358] "Generic (PLEG): container finished" podID="373bc30e-6149-48ca-a2be-a953653225a9" containerID="5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677" exitCode=0
Apr 22 19:53:14.845614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.845480 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"
Apr 22 19:53:14.845614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.845501 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" event={"ID":"373bc30e-6149-48ca-a2be-a953653225a9","Type":"ContainerDied","Data":"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"}
Apr 22 19:53:14.845614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.845537 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw" event={"ID":"373bc30e-6149-48ca-a2be-a953653225a9","Type":"ContainerDied","Data":"eea335a427c0b43be6505c0ba0901db666a9579616c0dd197121b3b2f9a43caa"}
Apr 22 19:53:14.845614 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.845552 2564 scope.go:117] "RemoveContainer" containerID="5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"
Apr 22 19:53:14.853604 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.853590 2564 scope.go:117] "RemoveContainer" containerID="5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"
Apr 22 19:53:14.853860 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:53:14.853839 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677\": container with ID starting with 5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677 not found: ID does not exist" containerID="5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"
Apr 22 19:53:14.853945 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.853867 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677"} err="failed to get container status \"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677\": rpc error: code = NotFound desc = could not find container \"5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677\": container with ID starting with 5a451abe35515118967c665bc8b688d27916e8872b1bff4c03345bfc83f88677 not found: ID does not exist"
Apr 22 19:53:14.865607 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.865586 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"]
Apr 22 19:53:14.868461 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:14.868440 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-62743-predictor-56cb884845-x67tw"]
Apr 22 19:53:16.367731 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:16.367676 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373bc30e-6149-48ca-a2be-a953653225a9" path="/var/lib/kubelet/pods/373bc30e-6149-48ca-a2be-a953653225a9/volumes"
Apr 22 19:53:18.358669 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:18.358633 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:22.836837 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:22.836791 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 19:53:23.359112 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:23.359075 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:23.359279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:23.359181 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:53:28.359030 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:28.358986 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:32.836554 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:32.836466 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 19:53:33.358745 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:33.358688 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:38.358660 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:38.358624 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:41.490339 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.490310 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:53:41.662908 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.662818 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls\") pod \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") "
Apr 22 19:53:41.663077 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.662915 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle\") pod \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\" (UID: \"055d9073-bc43-43f7-8fc6-d3c781cf6c85\") "
Apr 22 19:53:41.663255 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.663232 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "055d9073-bc43-43f7-8fc6-d3c781cf6c85" (UID: "055d9073-bc43-43f7-8fc6-d3c781cf6c85"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:53:41.664942 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.664913 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "055d9073-bc43-43f7-8fc6-d3c781cf6c85" (UID: "055d9073-bc43-43f7-8fc6-d3c781cf6c85"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:53:41.763668 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.763635 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055d9073-bc43-43f7-8fc6-d3c781cf6c85-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:53:41.763668 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.763667 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/055d9073-bc43-43f7-8fc6-d3c781cf6c85-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:53:41.929504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.929421 2564 generic.go:358] "Generic (PLEG): container finished" podID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerID="3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932" exitCode=0
Apr 22 19:53:41.929504 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.929485 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"
Apr 22 19:53:41.929673 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.929486 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" event={"ID":"055d9073-bc43-43f7-8fc6-d3c781cf6c85","Type":"ContainerDied","Data":"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"}
Apr 22 19:53:41.929673 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.929585 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267" event={"ID":"055d9073-bc43-43f7-8fc6-d3c781cf6c85","Type":"ContainerDied","Data":"b97c249e4d3c777e00ba323e9d060875b04d2b7663b315b9c056abf239ab8b71"}
Apr 22 19:53:41.929673 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.929599 2564 scope.go:117] "RemoveContainer" containerID="3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"
Apr 22 19:53:41.937726 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.937706 2564 scope.go:117] "RemoveContainer" containerID="3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"
Apr 22 19:53:41.938014 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:53:41.937992 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932\": container with ID starting with 3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932 not found: ID does not exist" containerID="3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"
Apr 22 19:53:41.938064 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.938024 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932"} err="failed to get container status \"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932\": rpc error: code = NotFound desc = could not find container \"3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932\": container with ID starting with 3a637edb8c545a1575f43fd314adcf8fb024671f8e068b2b54a20ef0bea57932 not found: ID does not exist"
Apr 22 19:53:41.950283 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.950263 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"]
Apr 22 19:53:41.952974 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:41.952954 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-62743-7bf7944f9c-pj267"]
Apr 22 19:53:42.367377 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:42.367344 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" path="/var/lib/kubelet/pods/055d9073-bc43-43f7-8fc6-d3c781cf6c85/volumes"
Apr 22 19:53:42.387258 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:42.387233 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:53:42.390597 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:42.390573 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log"
Apr 22 19:53:42.836432 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:42.836396 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 19:53:52.837132 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:52.837092 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 19:53:57.299728 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.299670 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"]
Apr 22 19:53:57.300164 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.299924 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" containerID="cri-o://d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428" gracePeriod=30
Apr 22 19:53:57.392190 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.392154 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:53:57.433857 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.433828 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"]
Apr 22 19:53:57.434177 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434165 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container"
Apr 22 19:53:57.434230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434178 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container"
Apr 22 19:53:57.434230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434203 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743"
Apr 22 19:53:57.434230 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434209 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743"
Apr 22 19:53:57.434326 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434253 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="373bc30e-6149-48ca-a2be-a953653225a9" containerName="kserve-container"
Apr 22 19:53:57.434326 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.434262 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="055d9073-bc43-43f7-8fc6-d3c781cf6c85" containerName="ensemble-graph-62743"
Apr 22 19:53:57.438728 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.438706 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"
Apr 22 19:53:57.443968 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.443942 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"]
Apr 22 19:53:57.448889 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.448869 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"
Apr 22 19:53:57.568835 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.568806 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"]
Apr 22 19:53:57.571283 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:53:57.571257 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12c4ff3_d289_4622_a303_df834301b56f.slice/crio-8a17fc691a5f4e58553adfc2210291af072e7f7c0ed37bb4a535be8cd2a12926 WatchSource:0}: Error finding container 8a17fc691a5f4e58553adfc2210291af072e7f7c0ed37bb4a535be8cd2a12926: Status 404 returned error can't find the container with id 8a17fc691a5f4e58553adfc2210291af072e7f7c0ed37bb4a535be8cd2a12926
Apr 22 19:53:57.983662 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.983571 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" event={"ID":"e12c4ff3-d289-4622-a303-df834301b56f","Type":"ContainerStarted","Data":"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805"}
Apr 22 19:53:57.983662 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.983611 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" event={"ID":"e12c4ff3-d289-4622-a303-df834301b56f","Type":"ContainerStarted","Data":"8a17fc691a5f4e58553adfc2210291af072e7f7c0ed37bb4a535be8cd2a12926"}
Apr 22 19:53:57.983895 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:57.983668 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container" containerID="cri-o://78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d" gracePeriod=30
Apr 22 19:53:58.003953 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:58.003474 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podStartSLOduration=1.003457256 podStartE2EDuration="1.003457256s" podCreationTimestamp="2026-04-22 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:58.000943728 +0000 UTC m=+1816.195138339" watchObservedRunningTime="2026-04-22 19:53:58.003457256 +0000 UTC m=+1816.197651844"
Apr 22 19:53:58.987172 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:58.987139 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"
Apr 22 19:53:58.988520 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:58.988487 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:53:59.496736 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:59.496673 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:53:59.990536 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:53:59.990496 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:54:01.127414 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.127392 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"
Apr 22 19:54:01.997954 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.997923 2564 generic.go:358] "Generic (PLEG): container finished" podID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerID="78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d" exitCode=0
Apr 22 19:54:01.998125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.997981 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"
Apr 22 19:54:01.998125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.998016 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" event={"ID":"f57444f4-bd7f-44c3-b762-9278b7dc2927","Type":"ContainerDied","Data":"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"}
Apr 22 19:54:01.998125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.998058 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75" event={"ID":"f57444f4-bd7f-44c3-b762-9278b7dc2927","Type":"ContainerDied","Data":"ddac53f6fe285e257b5be32c288614f5853008f63dd3cba05ca13bb64f5ff3f0"}
Apr 22 19:54:01.998125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:01.998080 2564 scope.go:117] "RemoveContainer" containerID="78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"
Apr 22 19:54:02.005807 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.005790 2564 scope.go:117] "RemoveContainer" containerID="78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"
Apr 22 19:54:02.006066 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:54:02.006042 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d\": container with ID starting with 78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d not found: ID does not exist" containerID="78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"
Apr 22 19:54:02.006123 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.006079 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d"} err="failed to get container status \"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d\": rpc error: code = NotFound desc = could not find container \"78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d\": container with ID starting with 78cd8c403743125f203dfc97a67768f2e4ef1eb71e570143a960d6d82321a88d not found: ID does not exist"
Apr 22 19:54:02.018763 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.018736 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:54:02.022357 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.022337 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7d5c8-predictor-db8d55f47-79f75"]
Apr 22 19:54:02.371837 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.368738 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" path="/var/lib/kubelet/pods/f57444f4-bd7f-44c3-b762-9278b7dc2927/volumes"
Apr 22 19:54:02.837873 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:02.837844 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"
Apr 22 19:54:04.496642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:04.496604 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:54:09.496787 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:09.496747 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:54:09.497196 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:09.496859 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"
Apr 22 19:54:09.991225 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:09.991176 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:54:14.497274 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:14.497230 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:54:19.497424 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:19.497384 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:54:19.991113 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:19.991071 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 19:54:21.571107 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.571030 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"]
Apr 22 19:54:21.571431 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.571372 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container"
Apr 22 19:54:21.571431 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.571384 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container"
Apr 22 19:54:21.571512 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.571466 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f57444f4-bd7f-44c3-b762-9278b7dc2927" containerName="kserve-container"
Apr 22 19:54:21.574524 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.574509 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.576886 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.576858 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-44fda-kube-rbac-proxy-sar-config\""
Apr 22 19:54:21.576993 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.576925 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-44fda-serving-cert\""
Apr 22 19:54:21.584173 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.584149 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"]
Apr 22 19:54:21.650954 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.650920 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.651125 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.651018 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.752011 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.751976 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.752180 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.752037 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.752647 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.752629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.754337 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.754308 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls\") pod \"splitter-graph-44fda-85c85c6697-bn7lk\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:21.884649 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:21.884560 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:22.006508 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:54:22.006476 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de03c54_1d6d_4e7c_aed1_8e60f60f4e18.slice/crio-b3e0c07eac9750de220ea8ba8feeb7b5dcaed45b31e26dff6df8ea41c0031405 WatchSource:0}: Error finding container b3e0c07eac9750de220ea8ba8feeb7b5dcaed45b31e26dff6df8ea41c0031405: Status 404 returned error can't find the container with id b3e0c07eac9750de220ea8ba8feeb7b5dcaed45b31e26dff6df8ea41c0031405
Apr 22 19:54:22.007597 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:22.007568 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"]
Apr 22 19:54:22.061234 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:22.061204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" event={"ID":"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18","Type":"ContainerStarted","Data":"b3e0c07eac9750de220ea8ba8feeb7b5dcaed45b31e26dff6df8ea41c0031405"}
Apr 22 19:54:23.065411 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:23.065377 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" event={"ID":"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18","Type":"ContainerStarted","Data":"f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534"}
Apr 22 19:54:23.065797 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:23.065488 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"
Apr 22 19:54:23.081753 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:23.081710 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podStartSLOduration=2.081679498 podStartE2EDuration="2.081679498s" podCreationTimestamp="2026-04-22 19:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:54:23.080266033 +0000 UTC m=+1841.274460621" watchObservedRunningTime="2026-04-22 19:54:23.081679498 +0000 UTC m=+1841.275874085"
Apr 22 19:54:24.496565 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:24.496526 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:54:27.448024 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.448000 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"
Apr 22 19:54:27.500157 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.500127 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls\") pod \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") "
Apr 22 19:54:27.500311 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.500225 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle\") pod \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\" (UID: \"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17\") "
Apr 22 19:54:27.500669 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.500638 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" (UID: "bbc2835a-9e3e-4899-b3cc-1667ea5c7b17"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:54:27.502328 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.502298 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" (UID: "bbc2835a-9e3e-4899-b3cc-1667ea5c7b17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:54:27.601059 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.600976 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:54:27.601059 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:27.601006 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\""
Apr 22 19:54:28.081131 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.081094 2564 generic.go:358] "Generic (PLEG): container finished" podID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerID="d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428" exitCode=0
Apr 22 19:54:28.081297 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.081150 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" event={"ID":"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17","Type":"ContainerDied","Data":"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"}
Apr 22 19:54:28.081297 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.081160 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"
Apr 22 19:54:28.081297 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.081180 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd" event={"ID":"bbc2835a-9e3e-4899-b3cc-1667ea5c7b17","Type":"ContainerDied","Data":"6eebbeb120d6dd0d36272ce7d3ee23b8dc26c60bc7543d40ead91dd6f5e86dd4"}
Apr 22 19:54:28.081297 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.081200 2564 scope.go:117] "RemoveContainer" containerID="d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"
Apr 22 19:54:28.089787 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.089762 2564 scope.go:117] "RemoveContainer" containerID="d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"
Apr 22 19:54:28.090117 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:54:28.090088 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428\": container with ID starting with d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428 not found: ID does not exist" containerID="d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"
Apr 22 19:54:28.090214 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.090127 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428"} err="failed to get container status \"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428\": rpc error: code = NotFound desc = could not find container \"d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428\": container with ID starting with d27a5409ee97aacef71f93a3ee724feafba226042e1462248b1a77d06d139428
not found: ID does not exist" Apr 22 19:54:28.101127 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.101098 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"] Apr 22 19:54:28.104739 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.104712 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7d5c8-5fbffdc9c5-g5xfd"] Apr 22 19:54:28.367600 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:28.367526 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" path="/var/lib/kubelet/pods/bbc2835a-9e3e-4899-b3cc-1667ea5c7b17/volumes" Apr 22 19:54:29.074292 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:29.074267 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" Apr 22 19:54:29.991625 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:29.991558 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 19:54:31.651769 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.651738 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"] Apr 22 19:54:31.652226 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.651975 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" containerID="cri-o://f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534" gracePeriod=30 Apr 22 19:54:31.778431 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.778401 2564 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 19:54:31.778842 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.778828 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" Apr 22 19:54:31.778890 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.778843 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" Apr 22 19:54:31.778924 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.778914 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbc2835a-9e3e-4899-b3cc-1667ea5c7b17" containerName="sequence-graph-7d5c8" Apr 22 19:54:31.784866 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.784837 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 19:54:31.787899 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.787874 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:54:31.788093 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.788070 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" containerID="cri-o://11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9" gracePeriod=30 Apr 22 19:54:31.791632 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.791603 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 19:54:31.796546 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.796526 2564 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 19:54:31.921839 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:31.921817 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 19:54:31.924312 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:54:31.924284 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b998a1e_04b5_400d_9382_8f9ce90e7c1b.slice/crio-c1761aef2940445730a72d03f1e0b474dc9c1c5bc2fa2658576532a89321d3c5 WatchSource:0}: Error finding container c1761aef2940445730a72d03f1e0b474dc9c1c5bc2fa2658576532a89321d3c5: Status 404 returned error can't find the container with id c1761aef2940445730a72d03f1e0b474dc9c1c5bc2fa2658576532a89321d3c5 Apr 22 19:54:32.096046 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.096004 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" event={"ID":"3b998a1e-04b5-400d-9382-8f9ce90e7c1b","Type":"ContainerStarted","Data":"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173"} Apr 22 19:54:32.096242 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.096054 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" event={"ID":"3b998a1e-04b5-400d-9382-8f9ce90e7c1b","Type":"ContainerStarted","Data":"c1761aef2940445730a72d03f1e0b474dc9c1c5bc2fa2658576532a89321d3c5"} Apr 22 19:54:32.096242 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.096220 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 19:54:32.097212 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.097174 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:54:32.137631 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.137583 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podStartSLOduration=1.137567988 podStartE2EDuration="1.137567988s" podCreationTimestamp="2026-04-22 19:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:54:32.132780355 +0000 UTC m=+1850.326974964" watchObservedRunningTime="2026-04-22 19:54:32.137567988 +0000 UTC m=+1850.331762574" Apr 22 19:54:32.836770 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:32.836723 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 19:54:33.099683 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:33.099591 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:54:34.073093 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:34.073056 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:54:34.934391 
ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:34.934370 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" Apr 22 19:54:35.108012 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.107974 2564 generic.go:358] "Generic (PLEG): container finished" podID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerID="11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9" exitCode=0 Apr 22 19:54:35.108420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.108032 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" Apr 22 19:54:35.108420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.108054 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" event={"ID":"ab3fc4e2-e666-4f3e-aa46-b53556f14131","Type":"ContainerDied","Data":"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9"} Apr 22 19:54:35.108420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.108087 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8" event={"ID":"ab3fc4e2-e666-4f3e-aa46-b53556f14131","Type":"ContainerDied","Data":"48df24a371c9fbd00099b37376e7ec14ea0b6e273b31dd7d6b0b8142b1495272"} Apr 22 19:54:35.108420 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.108102 2564 scope.go:117] "RemoveContainer" containerID="11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9" Apr 22 19:54:35.115621 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.115604 2564 scope.go:117] "RemoveContainer" containerID="11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9" Apr 22 19:54:35.115908 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:54:35.115891 2564 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9\": container with ID starting with 11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9 not found: ID does not exist" containerID="11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9" Apr 22 19:54:35.115972 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.115916 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9"} err="failed to get container status \"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9\": rpc error: code = NotFound desc = could not find container \"11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9\": container with ID starting with 11a40ceb10133f0a429fcc72d4fe1e7bda36a37ca7ad43d37b6ae632839c49a9 not found: ID does not exist" Apr 22 19:54:35.127329 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.127305 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:54:35.129240 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:35.129221 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44fda-predictor-7b4b7f6c87-bs9s8"] Apr 22 19:54:36.367443 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:36.367410 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" path="/var/lib/kubelet/pods/ab3fc4e2-e666-4f3e-aa46-b53556f14131/volumes" Apr 22 19:54:39.073351 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:39.073313 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 22 19:54:39.991285 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:39.991239 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 19:54:43.100051 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:43.100005 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:54:44.073610 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:44.073568 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:54:44.073802 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:44.073706 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" Apr 22 19:54:49.073040 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:49.072998 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:54:49.992415 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:49.992384 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" Apr 22 19:54:53.100414 ip-10-0-143-198 kubenswrapper[2564]: I0422 
19:54:53.100363 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:54:54.074288 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:54.074243 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:54:59.072950 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:54:59.072906 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:55:01.680108 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:01.680081 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de03c54_1d6d_4e7c_aed1_8e60f60f4e18.slice/crio-conmon-f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:55:01.680387 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:01.680131 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de03c54_1d6d_4e7c_aed1_8e60f60f4e18.slice/crio-conmon-f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:55:02.190818 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.190782 2564 
generic.go:358] "Generic (PLEG): container finished" podID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerID="f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534" exitCode=0 Apr 22 19:55:02.191008 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.190850 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" event={"ID":"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18","Type":"ContainerDied","Data":"f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534"} Apr 22 19:55:02.289853 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.289830 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" Apr 22 19:55:02.311817 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.311789 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls\") pod \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " Apr 22 19:55:02.311955 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.311843 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle\") pod \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\" (UID: \"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18\") " Apr 22 19:55:02.312195 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.312175 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" (UID: "4de03c54-1d6d-4e7c-aed1-8e60f60f4e18"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:55:02.313791 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.313765 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" (UID: "4de03c54-1d6d-4e7c-aed1-8e60f60f4e18"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:55:02.413027 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.412989 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:55:02.413027 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:02.413019 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 19:55:03.100035 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.099991 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:55:03.195279 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.195246 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" event={"ID":"4de03c54-1d6d-4e7c-aed1-8e60f60f4e18","Type":"ContainerDied","Data":"b3e0c07eac9750de220ea8ba8feeb7b5dcaed45b31e26dff6df8ea41c0031405"} Apr 22 19:55:03.195462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.195304 2564 scope.go:117] "RemoveContainer" 
containerID="f8a5c2ae4294c055b9197879f22d370a708b28e76a23884b9881e29b618a3534" Apr 22 19:55:03.195462 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.195311 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk" Apr 22 19:55:03.211332 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.211307 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"] Apr 22 19:55:03.215154 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:03.215131 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-44fda-85c85c6697-bn7lk"] Apr 22 19:55:04.368202 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:04.368173 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" path="/var/lib/kubelet/pods/4de03c54-1d6d-4e7c-aed1-8e60f60f4e18/volumes" Apr 22 19:55:07.526011 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.525982 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 19:55:07.526408 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526387 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" Apr 22 19:55:07.526408 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526408 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" Apr 22 19:55:07.526503 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526426 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" Apr 22 19:55:07.526503 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526433 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" Apr 22 19:55:07.526566 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526510 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4de03c54-1d6d-4e7c-aed1-8e60f60f4e18" containerName="splitter-graph-44fda" Apr 22 19:55:07.526566 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.526523 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab3fc4e2-e666-4f3e-aa46-b53556f14131" containerName="kserve-container" Apr 22 19:55:07.530668 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.530653 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:07.533245 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.533222 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-65828-kube-rbac-proxy-sar-config\"" Apr 22 19:55:07.533245 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.533230 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-65828-serving-cert\"" Apr 22 19:55:07.533491 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.533262 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:55:07.535783 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.535765 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 19:55:07.556642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.556609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " 
pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:07.556642 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.556642 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:07.657392 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.657357 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:07.657392 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.657399 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:07.657611 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:07.657451 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-65828-serving-cert: secret "switch-graph-65828-serving-cert" not found Apr 22 19:55:07.657611 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:07.657519 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls podName:a6d78a95-b6d5-4c2a-a1cc-4d9b25743619 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:55:08.157504032 +0000 UTC m=+1886.351698597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls") pod "switch-graph-65828-75747dbbc6-px2gk" (UID: "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619") : secret "switch-graph-65828-serving-cert" not found Apr 22 19:55:07.658034 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:07.658014 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:08.162055 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:08.162022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:08.162231 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:08.162136 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-65828-serving-cert: secret "switch-graph-65828-serving-cert" not found Apr 22 19:55:08.162231 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:08.162186 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls podName:a6d78a95-b6d5-4c2a-a1cc-4d9b25743619 nodeName:}" failed. No retries permitted until 2026-04-22 19:55:09.162172786 +0000 UTC m=+1887.356367352 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls") pod "switch-graph-65828-75747dbbc6-px2gk" (UID: "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619") : secret "switch-graph-65828-serving-cert" not found Apr 22 19:55:09.171688 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:09.171644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:09.174109 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:09.174084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"switch-graph-65828-75747dbbc6-px2gk\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:09.342447 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:09.342412 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:09.458130 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:09.457961 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 19:55:09.460679 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:55:09.460655 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d78a95_b6d5_4c2a_a1cc_4d9b25743619.slice/crio-4e38166858de350408f73beed41d593b1feac094ea4fca497e3c51220827ac0c WatchSource:0}: Error finding container 4e38166858de350408f73beed41d593b1feac094ea4fca497e3c51220827ac0c: Status 404 returned error can't find the container with id 4e38166858de350408f73beed41d593b1feac094ea4fca497e3c51220827ac0c Apr 22 19:55:10.220736 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:10.220682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" event={"ID":"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619","Type":"ContainerStarted","Data":"1dc475452cb72e660b621ef5aa9b7a069abd4fea591fb2d5b8bb8e87d69f6c2a"} Apr 22 19:55:10.220736 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:10.220738 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" event={"ID":"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619","Type":"ContainerStarted","Data":"4e38166858de350408f73beed41d593b1feac094ea4fca497e3c51220827ac0c"} Apr 22 19:55:10.221208 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:10.220813 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:10.237939 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:10.237895 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" 
podStartSLOduration=3.237877682 podStartE2EDuration="3.237877682s" podCreationTimestamp="2026-04-22 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:55:10.237433572 +0000 UTC m=+1888.431628211" watchObservedRunningTime="2026-04-22 19:55:10.237877682 +0000 UTC m=+1888.432072269" Apr 22 19:55:13.100061 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:13.100024 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 19:55:16.229762 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:16.229736 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 19:55:23.101469 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:23.101441 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 19:55:41.862778 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.862742 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 19:55:41.866168 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.866151 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:41.870737 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.870716 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0c611-serving-cert\"" Apr 22 19:55:41.870819 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.870718 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0c611-kube-rbac-proxy-sar-config\"" Apr 22 19:55:41.873663 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.873638 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 19:55:41.955747 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.955687 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:41.955915 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:41.955755 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.057017 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.056979 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle\") pod 
\"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.057017 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.057022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.057219 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:42.057113 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-0c611-serving-cert: secret "splitter-graph-0c611-serving-cert" not found Apr 22 19:55:42.057219 ip-10-0-143-198 kubenswrapper[2564]: E0422 19:55:42.057174 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls podName:79f970de-fcb1-488a-9b4a-84f61a83a17b nodeName:}" failed. No retries permitted until 2026-04-22 19:55:42.557157778 +0000 UTC m=+1920.751352344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls") pod "splitter-graph-0c611-7c9746c8d4-fmcpt" (UID: "79f970de-fcb1-488a-9b4a-84f61a83a17b") : secret "splitter-graph-0c611-serving-cert" not found Apr 22 19:55:42.057558 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.057538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.560966 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.560934 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.563212 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.563190 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") pod \"splitter-graph-0c611-7c9746c8d4-fmcpt\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.777752 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.777639 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:42.894388 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:42.894356 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 19:55:42.897158 ip-10-0-143-198 kubenswrapper[2564]: W0422 19:55:42.897130 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f970de_fcb1_488a_9b4a_84f61a83a17b.slice/crio-c6676f5111e8aea3bb9d7892bfc346473bea032b7094d236e6e488c14f8f1398 WatchSource:0}: Error finding container c6676f5111e8aea3bb9d7892bfc346473bea032b7094d236e6e488c14f8f1398: Status 404 returned error can't find the container with id c6676f5111e8aea3bb9d7892bfc346473bea032b7094d236e6e488c14f8f1398 Apr 22 19:55:43.321163 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:43.321130 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" event={"ID":"79f970de-fcb1-488a-9b4a-84f61a83a17b","Type":"ContainerStarted","Data":"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9"} Apr 22 19:55:43.321163 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:43.321165 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" event={"ID":"79f970de-fcb1-488a-9b4a-84f61a83a17b","Type":"ContainerStarted","Data":"c6676f5111e8aea3bb9d7892bfc346473bea032b7094d236e6e488c14f8f1398"} Apr 22 19:55:43.321371 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:43.321237 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:55:43.338313 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:43.338272 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" 
podStartSLOduration=2.338259082 podStartE2EDuration="2.338259082s" podCreationTimestamp="2026-04-22 19:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:55:43.336758826 +0000 UTC m=+1921.530953417" watchObservedRunningTime="2026-04-22 19:55:43.338259082 +0000 UTC m=+1921.532453732" Apr 22 19:55:49.329850 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:55:49.329818 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 19:58:42.410417 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:58:42.410383 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 19:58:42.414377 ip-10-0-143-198 kubenswrapper[2564]: I0422 19:58:42.414352 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:03:42.433452 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:42.433418 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:03:42.437197 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:42.437172 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:03:56.646769 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:56.646735 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 20:03:56.647330 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:56.646989 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" containerID="cri-o://58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9" gracePeriod=30 Apr 22 20:03:56.724344 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:56.724312 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 20:03:56.724552 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:56.724531 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" containerID="cri-o://654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173" gracePeriod=30 Apr 22 20:03:59.328471 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.328414 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:03:59.762763 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.762739 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 20:03:59.795958 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.795929 2564 generic.go:358] "Generic (PLEG): container finished" podID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerID="654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173" exitCode=0 Apr 22 20:03:59.796112 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.795976 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" event={"ID":"3b998a1e-04b5-400d-9382-8f9ce90e7c1b","Type":"ContainerDied","Data":"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173"} Apr 22 20:03:59.796112 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.795987 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" Apr 22 20:03:59.796112 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.795997 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc" event={"ID":"3b998a1e-04b5-400d-9382-8f9ce90e7c1b","Type":"ContainerDied","Data":"c1761aef2940445730a72d03f1e0b474dc9c1c5bc2fa2658576532a89321d3c5"} Apr 22 20:03:59.796112 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.796021 2564 scope.go:117] "RemoveContainer" containerID="654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173" Apr 22 20:03:59.803780 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.803758 2564 scope.go:117] "RemoveContainer" containerID="654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173" Apr 22 20:03:59.804133 ip-10-0-143-198 kubenswrapper[2564]: E0422 20:03:59.804002 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173\": container with ID starting with 654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173 not found: ID does not exist" containerID="654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173" Apr 22 20:03:59.804133 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.804034 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173"} err="failed to get container status \"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173\": rpc error: code = NotFound desc = could not find container \"654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173\": container with ID starting with 654ff5ee1e21a7495d26c04ecc9fab9c26160251a0664072f25bab07211a4173 not found: ID does not exist" Apr 22 20:03:59.816566 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.816540 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 20:03:59.820215 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:03:59.820196 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c611-predictor-55867b6f47-7ssmc"] Apr 22 20:04:00.367940 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:00.367900 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" path="/var/lib/kubelet/pods/3b998a1e-04b5-400d-9382-8f9ce90e7c1b/volumes" Apr 22 20:04:04.328488 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:04.328389 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:04:09.328768 ip-10-0-143-198 kubenswrapper[2564]: 
I0422 20:04:09.328727 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:04:09.329128 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:09.328823 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 20:04:14.328783 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:14.328737 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:04:19.328277 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:19.328240 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:04:24.328628 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:24.328578 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:04:26.788991 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.788967 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 20:04:26.877476 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.877432 2564 generic.go:358] "Generic (PLEG): container finished" podID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerID="58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9" exitCode=0 Apr 22 20:04:26.877662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.877500 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" Apr 22 20:04:26.877662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.877518 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" event={"ID":"79f970de-fcb1-488a-9b4a-84f61a83a17b","Type":"ContainerDied","Data":"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9"} Apr 22 20:04:26.877662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.877553 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt" event={"ID":"79f970de-fcb1-488a-9b4a-84f61a83a17b","Type":"ContainerDied","Data":"c6676f5111e8aea3bb9d7892bfc346473bea032b7094d236e6e488c14f8f1398"} Apr 22 20:04:26.877662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.877567 2564 scope.go:117] "RemoveContainer" containerID="58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9" Apr 22 20:04:26.885166 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.885152 2564 scope.go:117] "RemoveContainer" containerID="58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9" Apr 22 20:04:26.885387 ip-10-0-143-198 kubenswrapper[2564]: E0422 20:04:26.885373 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9\": container with ID starting with 
58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9 not found: ID does not exist" containerID="58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9" Apr 22 20:04:26.885424 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.885396 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9"} err="failed to get container status \"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9\": rpc error: code = NotFound desc = could not find container \"58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9\": container with ID starting with 58cf7380374f12a39d750a65d6dff5da778fbdd3b4b58e7ea73c7d54a72d1fb9 not found: ID does not exist" Apr 22 20:04:26.959474 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.959393 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") pod \"79f970de-fcb1-488a-9b4a-84f61a83a17b\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " Apr 22 20:04:26.959611 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.959512 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle\") pod \"79f970de-fcb1-488a-9b4a-84f61a83a17b\" (UID: \"79f970de-fcb1-488a-9b4a-84f61a83a17b\") " Apr 22 20:04:26.959903 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.959871 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "79f970de-fcb1-488a-9b4a-84f61a83a17b" (UID: "79f970de-fcb1-488a-9b4a-84f61a83a17b"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:04:26.961523 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:26.961507 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "79f970de-fcb1-488a-9b4a-84f61a83a17b" (UID: "79f970de-fcb1-488a-9b4a-84f61a83a17b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:04:27.060469 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:27.060431 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79f970de-fcb1-488a-9b4a-84f61a83a17b-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:04:27.060469 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:27.060466 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f970de-fcb1-488a-9b4a-84f61a83a17b-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:04:27.198031 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:27.198000 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 20:04:27.201610 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:27.201586 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c611-7c9746c8d4-fmcpt"] Apr 22 20:04:28.367766 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:04:28.367729 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" path="/var/lib/kubelet/pods/79f970de-fcb1-488a-9b4a-84f61a83a17b/volumes" Apr 22 20:08:42.458259 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:08:42.458145 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:08:42.463822 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:08:42.463806 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:11:26.795652 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:26.795621 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 20:11:26.796161 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:26.795885 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" containerID="cri-o://1dc475452cb72e660b621ef5aa9b7a069abd4fea591fb2d5b8bb8e87d69f6c2a" gracePeriod=30 Apr 22 20:11:26.896903 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:26.896870 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"] Apr 22 20:11:26.897146 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:26.897106 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" containerID="cri-o://b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805" gracePeriod=30 Apr 22 20:11:28.089625 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.089586 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qsd4b/must-gather-xcpdj"] Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.089934 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" 
containerName="kserve-container" Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.089945 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.089967 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.089973 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.090021 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b998a1e-04b5-400d-9382-8f9ce90e7c1b" containerName="kserve-container" Apr 22 20:11:28.090025 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.090030 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="79f970de-fcb1-488a-9b4a-84f61a83a17b" containerName="splitter-graph-0c611" Apr 22 20:11:28.092993 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.092978 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.095365 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.095341 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qsd4b\"/\"default-dockercfg-jhspw\"" Apr 22 20:11:28.096465 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.096447 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsd4b\"/\"openshift-service-ca.crt\"" Apr 22 20:11:28.096566 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.096468 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsd4b\"/\"kube-root-ca.crt\"" Apr 22 20:11:28.107845 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.107821 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsd4b/must-gather-xcpdj"] Apr 22 20:11:28.211145 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.211113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcm82\" (UniqueName: \"kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.211337 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.211166 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.311638 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.311602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcm82\" (UniqueName: 
\"kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.311844 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.311657 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.312061 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.312044 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.319786 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.319761 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcm82\" (UniqueName: \"kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82\") pod \"must-gather-xcpdj\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.415040 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.414959 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:28.528631 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.528599 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsd4b/must-gather-xcpdj"] Apr 22 20:11:28.531475 ip-10-0-143-198 kubenswrapper[2564]: W0422 20:11:28.531444 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78641fbc_1d1b_4401_96cb_2917743b32ec.slice/crio-9c3f4b4684f2725caadac5eb7a26a61a2fb3428dd50c246523e0f478b5070229 WatchSource:0}: Error finding container 9c3f4b4684f2725caadac5eb7a26a61a2fb3428dd50c246523e0f478b5070229: Status 404 returned error can't find the container with id 9c3f4b4684f2725caadac5eb7a26a61a2fb3428dd50c246523e0f478b5070229 Apr 22 20:11:28.533131 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:28.533116 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:11:29.144184 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:29.141103 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" event={"ID":"78641fbc-1d1b-4401-96cb-2917743b32ec","Type":"ContainerStarted","Data":"9c3f4b4684f2725caadac5eb7a26a61a2fb3428dd50c246523e0f478b5070229"} Apr 22 20:11:30.047352 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.047329 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" Apr 22 20:11:30.145093 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.145058 2564 generic.go:358] "Generic (PLEG): container finished" podID="e12c4ff3-d289-4622-a303-df834301b56f" containerID="b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805" exitCode=0 Apr 22 20:11:30.145530 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.145115 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" Apr 22 20:11:30.145530 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.145141 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" event={"ID":"e12c4ff3-d289-4622-a303-df834301b56f","Type":"ContainerDied","Data":"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805"} Apr 22 20:11:30.145530 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.145182 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" event={"ID":"e12c4ff3-d289-4622-a303-df834301b56f","Type":"ContainerDied","Data":"8a17fc691a5f4e58553adfc2210291af072e7f7c0ed37bb4a535be8cd2a12926"} Apr 22 20:11:30.145530 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.145203 2564 scope.go:117] "RemoveContainer" containerID="b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805" Apr 22 20:11:30.153233 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.153217 2564 scope.go:117] "RemoveContainer" containerID="b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805" Apr 22 20:11:30.153473 ip-10-0-143-198 kubenswrapper[2564]: E0422 20:11:30.153453 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805\": container with ID starting with b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805 not found: ID does not exist" containerID="b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805" Apr 22 20:11:30.153524 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.153482 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805"} err="failed to get container status \"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805\": rpc error: code = NotFound desc = could not find container \"b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805\": container with ID starting with b4ad2b0a04589804333a7e23428a63d0a326456fb424509cbb3d2ed4068e2805 not found: ID does not exist" Apr 22 20:11:30.166668 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.166641 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"] Apr 22 20:11:30.170467 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.170443 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4"] Apr 22 20:11:30.367814 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.367781 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12c4ff3-d289-4622-a303-df834301b56f" path="/var/lib/kubelet/pods/e12c4ff3-d289-4622-a303-df834301b56f/volumes" Apr 22 20:11:30.991991 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:30.991876 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65828-predictor-b79ff5787-mv5m4" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: i/o timeout" Apr 22 20:11:31.229122 ip-10-0-143-198 
kubenswrapper[2564]: I0422 20:11:31.229079 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:34.160666 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:34.160626 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" event={"ID":"78641fbc-1d1b-4401-96cb-2917743b32ec","Type":"ContainerStarted","Data":"bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78"} Apr 22 20:11:34.161079 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:34.160673 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" event={"ID":"78641fbc-1d1b-4401-96cb-2917743b32ec","Type":"ContainerStarted","Data":"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323"} Apr 22 20:11:34.177478 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:34.177432 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" podStartSLOduration=1.4374139879999999 podStartE2EDuration="6.177418914s" podCreationTimestamp="2026-04-22 20:11:28 +0000 UTC" firstStartedPulling="2026-04-22 20:11:28.533235499 +0000 UTC m=+2866.727430065" lastFinishedPulling="2026-04-22 20:11:33.273240416 +0000 UTC m=+2871.467434991" observedRunningTime="2026-04-22 20:11:34.176141113 +0000 UTC m=+2872.370335715" watchObservedRunningTime="2026-04-22 20:11:34.177418914 +0000 UTC m=+2872.371613501" Apr 22 20:11:36.229309 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:36.229269 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 22 20:11:41.228864 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:41.228816 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:41.229328 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:41.228943 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 20:11:41.717253 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:41.717218 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:42.409107 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:42.409074 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:43.107924 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:43.107895 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:43.778361 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:43.778326 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:44.435195 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:44.435169 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:45.094431 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:45.094396 2564 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:45.762013 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:45.761976 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:46.229184 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:46.229149 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:46.438973 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:46.438937 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:47.114251 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:47.114221 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:47.801183 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:47.801156 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:48.473440 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:48.473411 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:49.181563 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:49.181540 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-65828-75747dbbc6-px2gk_a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/switch-graph-65828/0.log" Apr 22 20:11:51.215971 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:51.215937 2564 generic.go:358] "Generic (PLEG): container finished" podID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerID="db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323" exitCode=0 Apr 22 20:11:51.216390 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:51.216016 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" event={"ID":"78641fbc-1d1b-4401-96cb-2917743b32ec","Type":"ContainerDied","Data":"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323"} Apr 22 20:11:51.216390 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:51.216353 2564 scope.go:117] "RemoveContainer" containerID="db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323" Apr 22 20:11:51.228167 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:51.228134 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:51.753620 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:51.753582 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qsd4b_must-gather-xcpdj_78641fbc-1d1b-4401-96cb-2917743b32ec/gather/0.log" Apr 22 20:11:52.310647 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.310617 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-94jtb/must-gather-ldpr7"] Apr 22 20:11:52.311024 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.310992 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" Apr 22 20:11:52.311024 
ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.311005 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" Apr 22 20:11:52.311097 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.311059 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e12c4ff3-d289-4622-a303-df834301b56f" containerName="kserve-container" Apr 22 20:11:52.313089 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.313074 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.315632 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.315613 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"kube-root-ca.crt\"" Apr 22 20:11:52.316758 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.316740 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-94jtb\"/\"default-dockercfg-btnhf\"" Apr 22 20:11:52.316836 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.316742 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"openshift-service-ca.crt\"" Apr 22 20:11:52.323442 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.323359 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/must-gather-ldpr7"] Apr 22 20:11:52.339041 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.339019 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjnd\" (UniqueName: \"kubernetes.io/projected/b4a16b39-1c72-4225-9958-061f98be380c-kube-api-access-czjnd\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.339154 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.339064 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b4a16b39-1c72-4225-9958-061f98be380c-must-gather-output\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.439827 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.439800 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czjnd\" (UniqueName: \"kubernetes.io/projected/b4a16b39-1c72-4225-9958-061f98be380c-kube-api-access-czjnd\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.439990 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.439849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b4a16b39-1c72-4225-9958-061f98be380c-must-gather-output\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.440200 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.440167 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b4a16b39-1c72-4225-9958-061f98be380c-must-gather-output\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.447581 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.447551 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjnd\" (UniqueName: \"kubernetes.io/projected/b4a16b39-1c72-4225-9958-061f98be380c-kube-api-access-czjnd\") pod \"must-gather-ldpr7\" (UID: \"b4a16b39-1c72-4225-9958-061f98be380c\") " pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 
20:11:52.626532 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.626446 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/must-gather-ldpr7" Apr 22 20:11:52.744730 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:52.744688 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/must-gather-ldpr7"] Apr 22 20:11:52.746786 ip-10-0-143-198 kubenswrapper[2564]: W0422 20:11:52.746750 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4a16b39_1c72_4225_9958_061f98be380c.slice/crio-4f907463eccebe7094dab8dce296ba384f62cef544f468daac0488c973e4eb20 WatchSource:0}: Error finding container 4f907463eccebe7094dab8dce296ba384f62cef544f468daac0488c973e4eb20: Status 404 returned error can't find the container with id 4f907463eccebe7094dab8dce296ba384f62cef544f468daac0488c973e4eb20 Apr 22 20:11:53.222759 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:53.222724 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/must-gather-ldpr7" event={"ID":"b4a16b39-1c72-4225-9958-061f98be380c","Type":"ContainerStarted","Data":"4f907463eccebe7094dab8dce296ba384f62cef544f468daac0488c973e4eb20"} Apr 22 20:11:54.229486 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:54.229446 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/must-gather-ldpr7" event={"ID":"b4a16b39-1c72-4225-9958-061f98be380c","Type":"ContainerStarted","Data":"5198c11e333f869effd5e8455cdb217c4263b628701e553b550d6e9029121978"} Apr 22 20:11:54.230340 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:54.230315 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/must-gather-ldpr7" event={"ID":"b4a16b39-1c72-4225-9958-061f98be380c","Type":"ContainerStarted","Data":"96fbc4bb3a64a1812833367ac059f586e5066966e04c49f756c9279c3eede113"} Apr 22 20:11:54.245520 
ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:54.245462 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-94jtb/must-gather-ldpr7" podStartSLOduration=1.400219668 podStartE2EDuration="2.245448616s" podCreationTimestamp="2026-04-22 20:11:52 +0000 UTC" firstStartedPulling="2026-04-22 20:11:52.748490122 +0000 UTC m=+2890.942684687" lastFinishedPulling="2026-04-22 20:11:53.59371907 +0000 UTC m=+2891.787913635" observedRunningTime="2026-04-22 20:11:54.244344087 +0000 UTC m=+2892.438538677" watchObservedRunningTime="2026-04-22 20:11:54.245448616 +0000 UTC m=+2892.439643202" Apr 22 20:11:54.952990 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:54.952958 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wkh5z_c34d1177-43b6-4961-8baf-9e9e12cd0ee6/global-pull-secret-syncer/0.log" Apr 22 20:11:55.026931 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:55.026900 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-88jfr_60db5cd9-d42e-4ebb-b880-d777700e74ea/konnectivity-agent/0.log" Apr 22 20:11:55.114552 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:55.114526 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-198.ec2.internal_42a9abee4bf02f33d29bcb7ba1a804d2/haproxy/0.log" Apr 22 20:11:56.233881 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:56.233827 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:57.146461 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.146421 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qsd4b/must-gather-xcpdj"] Apr 22 20:11:57.146976 ip-10-0-143-198 kubenswrapper[2564]: I0422 
20:11:57.146943 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="copy" containerID="cri-o://bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78" gracePeriod=2 Apr 22 20:11:57.151947 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.151915 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qsd4b/must-gather-xcpdj"] Apr 22 20:11:57.152350 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.152323 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:57.251096 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.251046 2564 generic.go:358] "Generic (PLEG): container finished" podID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerID="1dc475452cb72e660b621ef5aa9b7a069abd4fea591fb2d5b8bb8e87d69f6c2a" exitCode=0 Apr 22 20:11:57.251626 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.251154 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" event={"ID":"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619","Type":"ContainerDied","Data":"1dc475452cb72e660b621ef5aa9b7a069abd4fea591fb2d5b8bb8e87d69f6c2a"} Apr 22 20:11:57.523407 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.523330 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qsd4b_must-gather-xcpdj_78641fbc-1d1b-4401-96cb-2917743b32ec/copy/0.log" Apr 22 20:11:57.524205 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.523977 2564 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:57.526923 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.526882 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:57.594726 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.593151 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output\") pod \"78641fbc-1d1b-4401-96cb-2917743b32ec\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " Apr 22 20:11:57.594726 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.593250 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcm82\" (UniqueName: \"kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82\") pod \"78641fbc-1d1b-4401-96cb-2917743b32ec\" (UID: \"78641fbc-1d1b-4401-96cb-2917743b32ec\") " Apr 22 20:11:57.594948 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.594884 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "78641fbc-1d1b-4401-96cb-2917743b32ec" (UID: "78641fbc-1d1b-4401-96cb-2917743b32ec"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:57.597007 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.596726 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 20:11:57.602079 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.600671 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82" (OuterVolumeSpecName: "kube-api-access-dcm82") pod "78641fbc-1d1b-4401-96cb-2917743b32ec" (UID: "78641fbc-1d1b-4401-96cb-2917743b32ec"). InnerVolumeSpecName "kube-api-access-dcm82". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:11:57.620971 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.620924 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:57.693945 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.693841 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") pod \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " Apr 22 20:11:57.693945 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.693931 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle\") pod \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\" (UID: \"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619\") " Apr 22 20:11:57.694190 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.694176 2564 reconciler_common.go:299] "Volume detached for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/78641fbc-1d1b-4401-96cb-2917743b32ec-must-gather-output\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:11:57.694255 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.694194 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dcm82\" (UniqueName: \"kubernetes.io/projected/78641fbc-1d1b-4401-96cb-2917743b32ec-kube-api-access-dcm82\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:11:57.694575 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.694548 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" (UID: "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:11:57.711729 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.706809 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" (UID: "a6d78a95-b6d5-4c2a-a1cc-4d9b25743619"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:11:57.795662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.795611 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-proxy-tls\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:11:57.795662 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:57.795663 2564 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619-openshift-service-ca-bundle\") on node \"ip-10-0-143-198.ec2.internal\" DevicePath \"\"" Apr 22 20:11:58.256542 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.256501 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" event={"ID":"a6d78a95-b6d5-4c2a-a1cc-4d9b25743619","Type":"ContainerDied","Data":"4e38166858de350408f73beed41d593b1feac094ea4fca497e3c51220827ac0c"} Apr 22 20:11:58.257005 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.256554 2564 scope.go:117] "RemoveContainer" containerID="1dc475452cb72e660b621ef5aa9b7a069abd4fea591fb2d5b8bb8e87d69f6c2a" Apr 22 20:11:58.257005 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.256723 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk" Apr 22 20:11:58.271351 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.269688 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qsd4b_must-gather-xcpdj_78641fbc-1d1b-4401-96cb-2917743b32ec/copy/0.log" Apr 22 20:11:58.271351 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.270164 2564 generic.go:358] "Generic (PLEG): container finished" podID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerID="bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78" exitCode=143 Apr 22 20:11:58.271351 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.270307 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" Apr 22 20:11:58.271351 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.271063 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:58.272994 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.272880 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:58.281423 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.281405 2564 scope.go:117] "RemoveContainer" 
containerID="bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78" Apr 22 20:11:58.301006 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.300907 2564 status_manager.go:895] "Failed to get status for pod" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" pod="openshift-must-gather-qsd4b/must-gather-xcpdj" err="pods \"must-gather-xcpdj\" is forbidden: User \"system:node:ip-10-0-143-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qsd4b\": no relationship found between node 'ip-10-0-143-198.ec2.internal' and this object" Apr 22 20:11:58.308716 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.308292 2564 scope.go:117] "RemoveContainer" containerID="db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323" Apr 22 20:11:58.321542 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.320237 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 20:11:58.321542 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.321470 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-65828-75747dbbc6-px2gk"] Apr 22 20:11:58.331560 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.331482 2564 scope.go:117] "RemoveContainer" containerID="bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78" Apr 22 20:11:58.332350 ip-10-0-143-198 kubenswrapper[2564]: E0422 20:11:58.332055 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78\": container with ID starting with bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78 not found: ID does not exist" containerID="bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78" Apr 22 20:11:58.332350 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.332104 2564 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78"} err="failed to get container status \"bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78\": rpc error: code = NotFound desc = could not find container \"bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78\": container with ID starting with bdc655bfdcb7d80b69812103d6595ea1ead58bc179f6ef2c0f7f6928d8857a78 not found: ID does not exist" Apr 22 20:11:58.332350 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.332131 2564 scope.go:117] "RemoveContainer" containerID="db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323" Apr 22 20:11:58.332752 ip-10-0-143-198 kubenswrapper[2564]: E0422 20:11:58.332659 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323\": container with ID starting with db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323 not found: ID does not exist" containerID="db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323" Apr 22 20:11:58.332752 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.332689 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323"} err="failed to get container status \"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323\": rpc error: code = NotFound desc = could not find container \"db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323\": container with ID starting with db1d88778cb42cfbfc9b0deade33b8f63dac9330546d9a43ba07215aa2946323 not found: ID does not exist" Apr 22 20:11:58.345719 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.343015 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/alertmanager/0.log" Apr 22 20:11:58.371715 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.370985 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" path="/var/lib/kubelet/pods/78641fbc-1d1b-4401-96cb-2917743b32ec/volumes" Apr 22 20:11:58.372036 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.371742 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" path="/var/lib/kubelet/pods/a6d78a95-b6d5-4c2a-a1cc-4d9b25743619/volumes" Apr 22 20:11:58.373676 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.373654 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/config-reloader/0.log" Apr 22 20:11:58.397947 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.397915 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/kube-rbac-proxy-web/0.log" Apr 22 20:11:58.421424 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.421391 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/kube-rbac-proxy/0.log" Apr 22 20:11:58.443512 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.443476 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/kube-rbac-proxy-metric/0.log" Apr 22 20:11:58.467869 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.467800 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/prom-label-proxy/0.log" Apr 22 20:11:58.490744 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.490679 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5c424d6e-0497-4717-a020-80361697c6d9/init-config-reloader/0.log" Apr 22 20:11:58.555369 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.555343 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w5v7d_5c25b062-ea8e-4f67-b431-931fbd0173f4/kube-state-metrics/0.log" Apr 22 20:11:58.573222 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.572903 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w5v7d_5c25b062-ea8e-4f67-b431-931fbd0173f4/kube-rbac-proxy-main/0.log" Apr 22 20:11:58.593004 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.592941 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w5v7d_5c25b062-ea8e-4f67-b431-931fbd0173f4/kube-rbac-proxy-self/0.log" Apr 22 20:11:58.615788 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.615749 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7dd8c8d76-pknpl_44ee1a4f-9243-4b44-8982-b2d0a6bb4431/metrics-server/0.log" Apr 22 20:11:58.640406 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.640376 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4s5bq_a25a1776-3de7-4264-8c9d-13d256c65549/monitoring-plugin/0.log" Apr 22 20:11:58.674435 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.674398 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lsf9_5322e235-8e02-4de9-8bd7-c1732a34c595/node-exporter/0.log" Apr 22 20:11:58.694814 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.694785 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lsf9_5322e235-8e02-4de9-8bd7-c1732a34c595/kube-rbac-proxy/0.log" Apr 22 
20:11:58.715792 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.715763 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9lsf9_5322e235-8e02-4de9-8bd7-c1732a34c595/init-textfile/0.log" Apr 22 20:11:58.864290 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.864261 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b6rv5_72dc47d5-c6f5-467f-ae4f-27ae71a19818/kube-rbac-proxy-main/0.log" Apr 22 20:11:58.885359 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.885329 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b6rv5_72dc47d5-c6f5-467f-ae4f-27ae71a19818/kube-rbac-proxy-self/0.log" Apr 22 20:11:58.908147 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.908116 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b6rv5_72dc47d5-c6f5-467f-ae4f-27ae71a19818/openshift-state-metrics/0.log" Apr 22 20:11:58.957127 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.957091 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/prometheus/0.log" Apr 22 20:11:58.976129 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.976104 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/config-reloader/0.log" Apr 22 20:11:58.996097 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:58.996069 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/thanos-sidecar/0.log" Apr 22 20:11:59.014880 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.014819 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/kube-rbac-proxy-web/0.log" Apr 22 20:11:59.036431 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.036402 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/kube-rbac-proxy/0.log" Apr 22 20:11:59.056858 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.056830 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/kube-rbac-proxy-thanos/0.log" Apr 22 20:11:59.077767 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.077735 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a37fef64-58f3-4e78-8cf7-c2b0e4415b6a/init-config-reloader/0.log" Apr 22 20:11:59.109074 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.109046 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-4tjh8_7bb4fa87-e378-4de8-8930-a788bc72560c/prometheus-operator/0.log" Apr 22 20:11:59.125194 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.125118 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-4tjh8_7bb4fa87-e378-4de8-8930-a788bc72560c/kube-rbac-proxy/0.log" Apr 22 20:11:59.172346 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.172322 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6ddbcb786b-49sn6_98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f/telemeter-client/0.log" Apr 22 20:11:59.191816 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.191786 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6ddbcb786b-49sn6_98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f/reload/0.log" Apr 22 20:11:59.211043 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.211015 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6ddbcb786b-49sn6_98922dc3-030c-4d6b-ba9f-c7bdbc8dc55f/kube-rbac-proxy/0.log" Apr 22 20:11:59.237868 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.237839 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/thanos-query/0.log" Apr 22 20:11:59.256507 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.256480 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/kube-rbac-proxy-web/0.log" Apr 22 20:11:59.276174 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.276138 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/kube-rbac-proxy/0.log" Apr 22 20:11:59.295253 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.295218 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/prom-label-proxy/0.log" Apr 22 20:11:59.318098 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.318063 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/kube-rbac-proxy-rules/0.log" Apr 22 20:11:59.337279 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:11:59.337257 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-644d9fc5c6-272bn_9b1e50bb-3dd3-46b1-a930-24324c91640e/kube-rbac-proxy-metrics/0.log" Apr 22 20:12:01.259143 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:01.259066 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5899c94d75-hqjsz_0f58728b-eff1-4516-bc7f-9544f7faf5b2/console/0.log" Apr 22 
20:12:02.107277 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107246 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr"] Apr 22 20:12:02.107587 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107576 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="copy" Apr 22 20:12:02.107629 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107589 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="copy" Apr 22 20:12:02.107629 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107603 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="gather" Apr 22 20:12:02.107629 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107608 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="gather" Apr 22 20:12:02.107629 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107625 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" Apr 22 20:12:02.107803 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107630 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" Apr 22 20:12:02.107803 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107680 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="gather" Apr 22 20:12:02.107803 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.107688 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="78641fbc-1d1b-4401-96cb-2917743b32ec" containerName="copy" Apr 22 20:12:02.107803 ip-10-0-143-198 kubenswrapper[2564]: I0422 
20:12:02.107720 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6d78a95-b6d5-4c2a-a1cc-4d9b25743619" containerName="switch-graph-65828" Apr 22 20:12:02.111932 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.111903 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.121509 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.121482 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr"] Apr 22 20:12:02.142341 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.142302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssbk\" (UniqueName: \"kubernetes.io/projected/815e11c0-d78e-4d1d-8e65-08b00647a6b3-kube-api-access-gssbk\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.142533 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.142400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-proc\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.142533 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.142437 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-sys\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.142533 ip-10-0-143-198 kubenswrapper[2564]: I0422 
20:12:02.142472 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-lib-modules\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.142724 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.142537 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-podres\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.244835 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.244800 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-podres\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245020 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.244871 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gssbk\" (UniqueName: \"kubernetes.io/projected/815e11c0-d78e-4d1d-8e65-08b00647a6b3-kube-api-access-gssbk\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245020 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.244911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-proc\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: 
\"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245020 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.244946 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-sys\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245020 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.244981 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-lib-modules\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245228 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.245153 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-lib-modules\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245283 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.245232 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-podres\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245557 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.245535 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-proc\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.245627 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.245591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/815e11c0-d78e-4d1d-8e65-08b00647a6b3-sys\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.254207 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.254177 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssbk\" (UniqueName: \"kubernetes.io/projected/815e11c0-d78e-4d1d-8e65-08b00647a6b3-kube-api-access-gssbk\") pod \"perf-node-gather-daemonset-vvnpr\" (UID: \"815e11c0-d78e-4d1d-8e65-08b00647a6b3\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.290478 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.290449 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dq5b5_341073cd-9280-4d10-acb9-b1c0b32e7850/dns/0.log" Apr 22 20:12:02.308845 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.308816 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dq5b5_341073cd-9280-4d10-acb9-b1c0b32e7850/kube-rbac-proxy/0.log" Apr 22 20:12:02.386139 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.386067 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7dhdx_38bec921-89a6-4a82-b51d-20431c5dedc1/dns-node-resolver/0.log" Apr 22 20:12:02.424463 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.424438 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:02.561534 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.561417 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr"] Apr 22 20:12:02.564971 ip-10-0-143-198 kubenswrapper[2564]: W0422 20:12:02.564938 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod815e11c0_d78e_4d1d_8e65_08b00647a6b3.slice/crio-0ed88a7b81e3266ccf0915f30b4db7f067940fd0df774e7a331db01243d97b95 WatchSource:0}: Error finding container 0ed88a7b81e3266ccf0915f30b4db7f067940fd0df774e7a331db01243d97b95: Status 404 returned error can't find the container with id 0ed88a7b81e3266ccf0915f30b4db7f067940fd0df774e7a331db01243d97b95 Apr 22 20:12:02.862256 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:02.862226 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v22lp_92374958-e1ad-48ca-bdd6-3c9a98c2e9e3/node-ca/0.log" Apr 22 20:12:03.289945 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:03.289912 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" event={"ID":"815e11c0-d78e-4d1d-8e65-08b00647a6b3","Type":"ContainerStarted","Data":"1d0a8a0de03569c3aa67b690e6e55677ceefca4c10cd3da627533441110bf42d"} Apr 22 20:12:03.290132 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:03.289952 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" event={"ID":"815e11c0-d78e-4d1d-8e65-08b00647a6b3","Type":"ContainerStarted","Data":"0ed88a7b81e3266ccf0915f30b4db7f067940fd0df774e7a331db01243d97b95"} Apr 22 20:12:03.290132 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:03.290047 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 
22 20:12:03.307457 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:03.307412 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" podStartSLOduration=1.307393349 podStartE2EDuration="1.307393349s" podCreationTimestamp="2026-04-22 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:03.304895371 +0000 UTC m=+2901.499089959" watchObservedRunningTime="2026-04-22 20:12:03.307393349 +0000 UTC m=+2901.501587936" Apr 22 20:12:03.797628 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:03.797603 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k5rbc_5bcafd60-0344-423e-88a2-7e2ffae0f188/serve-healthcheck-canary/0.log" Apr 22 20:12:04.266183 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:04.266159 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nm6zf_3d4cf343-beba-4351-919c-473d70ddfbc4/kube-rbac-proxy/0.log" Apr 22 20:12:04.284187 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:04.284161 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nm6zf_3d4cf343-beba-4351-919c-473d70ddfbc4/exporter/0.log" Apr 22 20:12:04.304792 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:04.304767 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nm6zf_3d4cf343-beba-4351-919c-473d70ddfbc4/extractor/0.log" Apr 22 20:12:06.682525 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:06.682492 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-nzqdk_84048b42-efc8-4ee4-a171-ff4627d9d2e7/s3-init/0.log" Apr 22 20:12:09.301511 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:09.301483 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-vvnpr" Apr 22 20:12:11.200718 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.200660 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2222w_5092e3d5-3682-4a0c-bf3d-5313cc838278/kube-multus/0.log" Apr 22 20:12:11.255416 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.255380 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/kube-multus-additional-cni-plugins/0.log" Apr 22 20:12:11.275925 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.275899 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/egress-router-binary-copy/0.log" Apr 22 20:12:11.294926 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.294900 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/cni-plugins/0.log" Apr 22 20:12:11.314640 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.314614 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/bond-cni-plugin/0.log" Apr 22 20:12:11.333071 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.333032 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/routeoverride-cni/0.log" Apr 22 20:12:11.350583 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.350550 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/whereabouts-cni-bincopy/0.log" Apr 22 20:12:11.370830 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.370795 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8vbtk_41bc5667-8d3d-482d-9edb-6340167eb814/whereabouts-cni/0.log" Apr 22 20:12:11.740462 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.740425 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jblt6_6f6d5518-179f-4f70-8c2c-5b1b2a244e38/network-metrics-daemon/0.log" Apr 22 20:12:11.759326 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:11.759296 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jblt6_6f6d5518-179f-4f70-8c2c-5b1b2a244e38/kube-rbac-proxy/0.log" Apr 22 20:12:12.978409 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:12.978370 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-controller/0.log" Apr 22 20:12:12.996435 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:12.996413 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/0.log" Apr 22 20:12:13.012514 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.012442 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovn-acl-logging/1.log" Apr 22 20:12:13.031475 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.031451 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/kube-rbac-proxy-node/0.log" Apr 22 20:12:13.051352 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.051326 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:12:13.073732 ip-10-0-143-198 
kubenswrapper[2564]: I0422 20:12:13.073664 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/northd/0.log" Apr 22 20:12:13.097424 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.097399 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/nbdb/0.log" Apr 22 20:12:13.116529 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.116414 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/sbdb/0.log" Apr 22 20:12:13.221515 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:13.221481 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrxcm_90687231-1b2c-4845-9e57-cab76563d259/ovnkube-controller/0.log" Apr 22 20:12:14.088027 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:14.087997 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m4bzg_95433104-c840-4e8f-a3ff-c645c636f399/network-check-target-container/0.log" Apr 22 20:12:14.885854 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:14.885830 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rfw29_d46c700e-0847-4daf-bd26-4b29c5bee728/iptables-alerter/0.log" Apr 22 20:12:15.435526 ip-10-0-143-198 kubenswrapper[2564]: I0422 20:12:15.435498 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hvwp6_fb1d2f05-e6fc-4b8c-a646-fdebb0847854/tuned/0.log"