Apr 22 14:13:14.789137 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:13:14.789148 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:13:14.789158 ip-10-0-137-106 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:13:14.789457 ip-10-0-137-106 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:13:24.924283 ip-10-0-137-106 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:13:24.924297 ip-10-0-137-106 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 82180730b718429fbc674566619eb7c3 --
Apr 22 14:15:49.087109 ip-10-0-137-106 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:49.519238 ip-10-0-137-106 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:49.519238 ip-10-0-137-106 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:49.519238 ip-10-0-137-106 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:49.519238 ip-10-0-137-106 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:49.519238 ip-10-0-137-106 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
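Before the boot marker, systemd refuses to start kubelet.service with result 'resources': an EnvironmentFile= referenced by the unit does not exist, so the 'start-pre' task cannot run, and the follow-up restart job cannot even be scheduled because crio.service is not loaded at that point. A minimal diagnostic sketch using generic systemd tooling; the environment-file path shown is illustrative, since the log does not name it:

    # Show the unit, including the EnvironmentFile=/ExecStartPre= lines systemd could not satisfy
    systemctl cat kubelet.service | grep -n 'EnvironmentFile\|ExecStartPre'
    # Verify each referenced file actually exists; recreate or reinstall whatever provides it
    ls -l /etc/kubernetes/kubelet-env    # illustrative path, not taken from this log
    # 'Unit crio.service not found' means the CRI-O unit was not installed/loaded yet
    systemctl list-unit-files 'crio*'
    systemctl status crio.service
    # Re-check once the missing pieces are in place
    systemctl daemon-reload && systemctl restart kubelet.service
    journalctl -b -u kubelet.service --no-pager | head -n 50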
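After the reboot the kubelet does start, but it warns that several command-line flags are deprecated in favor of the file named by --config (/etc/kubernetes/kubelet.conf in the FLAG dump below). A hedged sketch of the config-file equivalents; the field names are standard KubeletConfiguration (kubelet.config.k8s.io/v1beta1) fields, the values mirror the flags visible in this log, and on an OpenShift node this file is operator-managed, so changes belong in a KubeletConfig/MachineConfig object rather than a hand edit:

    # Inspect the config file the deprecation warnings point at
    cat /etc/kubernetes/kubelet.conf
    # Flag-to-field mapping (values taken from the FLAG dump below):
    #   --container-runtime-endpoint  ->  containerRuntimeEndpoint: /var/run/crio/crio.sock
    #   --volume-plugin-dir           ->  volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    #   --system-reserved             ->  systemReserved: {cpu: 500m, ephemeral-storage: 1Gi, memory: 1Gi}
    #   --minimum-container-ttl-duration has no config-file field; per the warning,
    #   use evictionHard / evictionSoft thresholds instead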
Apr 22 14:15:49.521466 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.521394 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:49.523480 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523465 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:49.523480 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523480 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523483 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523486 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523489 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523492 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523495 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523498 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523500 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523503 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523505 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523508 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523511 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523513 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523520 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523523 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523526 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523528 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523530 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523533 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523535 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:49.523543 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523538 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523541 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523543 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523546 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523549 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523551 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523554 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523557 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523560 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523562 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523565 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523567 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523570 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523572 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523575 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523578 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523580 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523584 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523588 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523590 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:49.524013 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523593 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523596 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523598 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523600 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523603 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523605 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523608 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523610 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523612 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523615 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523617 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523620 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523623 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523627 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523632 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523635 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523638 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523641 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523643 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:49.524509 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523646 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523648 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523651 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523653 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523656 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523660 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523663 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523665 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523668 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523670 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523672 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523675 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523677 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523680 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523682 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523684 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523687 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523689 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523692 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523694 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:49.524965 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523697 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523699 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523701 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523704 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523706 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.523709 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524110 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524118 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524122 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524125 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524127 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524130 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524133 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524135 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524138 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524141 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524143 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524146 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524149 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524152 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:49.525467 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524154 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524156 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524159 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524161 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524163 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524168 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524171 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524173 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524176 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524179 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524182 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524184 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524187 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524189 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524192 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524194 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524197 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524200 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524202 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524205 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:49.525947 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524208 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524210 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524213 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524216 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524219 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524221 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524224 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524226 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524229 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524231 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524234 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524237 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524239 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524241 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524244 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524246 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524249 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524251 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524253 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:49.526512 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524256 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524258 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524260 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524263 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524265 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524267 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524270 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524273 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524275 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524278 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524280 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524283 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524285 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524288 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524291 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524293 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524296 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524298 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524300 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:49.526991 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524303 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524305 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524307 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524310 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524312 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524314 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524319 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524322 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524325 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524328 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524331 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524333 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524335 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.524338 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525473 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525482 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525488 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525493 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525497 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525500 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525504 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:49.527453 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525508 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525511 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525514 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525517 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525521 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525524 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525527 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525530 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525532 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525535 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525538 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525540 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525544 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525547 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525549 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525552 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525555 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525559 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525562 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525565 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525568 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525570 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525573 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525576 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525579 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:49.527969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525582 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525586 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525589 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525592 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525595 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525598 2579 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525601 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525605 2579 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525608 2579 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525611 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525614 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525616 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525621 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525623 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525626 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525629 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525632 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525635 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525638 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525641 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525644 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525646 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525649 2579 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525653 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525655 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 14:15:49.528563 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525658 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525662 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525665 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525668 2579 flags.go:64] FLAG: --help="false"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525670 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-137-106.ec2.internal"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525673 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525676 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525679 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525682 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525686 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525689 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525692 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525694 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525697 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525700 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525703 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525706 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525709 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525712 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525715 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525718 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525720 2579 flags.go:64] FLAG: --lock-file=""
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525723 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525726 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 14:15:49.529153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525729 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525733 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525736 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525739 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525742 2579 flags.go:64] FLAG: --logging-format="text"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525744 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525748 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525750 2579 flags.go:64] FLAG: --manifest-url=""
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525753 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525757 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525760 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525765 2579 flags.go:64] FLAG: --max-pods="110"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525768 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525770 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525773 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525776 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525779 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525783 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525786 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525793 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525796 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525799 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525802 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 22 14:15:49.529716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525805 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525810 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525813 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525816 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525819 2579 flags.go:64] FLAG: --port="10250"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525823 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525825 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-088d776d0db10d946"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525828 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525843 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525846 2579 flags.go:64] FLAG: --register-node="true"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525849 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525852 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525855 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525858 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525861 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525864 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525867 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525870 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525873 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525876 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525879 2579 flags.go:64] FLAG: --runonce="false"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525882 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525885 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525888 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525891 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525894 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 14:15:49.530259 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525897 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525900 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525903 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525909 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525911 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525914 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525917 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525920 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525923 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525926 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525932 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525934 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525937 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525941 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525943 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525946 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525949 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525952 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525955 2579 flags.go:64] FLAG: --v="2"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525959 2579 flags.go:64] FLAG: --version="false"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525964 2579 flags.go:64] FLAG: --vmodule=""
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525967 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.525970 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526057 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:49.530891 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526061 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526064 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526066 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526070 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526073 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526076 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526080 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526083 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526086 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526088 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526091 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526095 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526098 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526100 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526103 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526106 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526108 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526111 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526115 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526118 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:49.531448 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526123 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526126 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526129 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526132 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526134 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526137 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526139 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526142 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526144 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526147 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526149 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526151 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526154 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526156 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526158 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526161 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526164 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526166 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526170 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526172 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:49.531973 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526175 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526177 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526180 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526184 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526186 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526189 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526191 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526194 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526196 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526198 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526201 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526204 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526206 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526209 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526211 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526213 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526216 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526218 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526221 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526223 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr
22 14:15:49.532452 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526226 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526228 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526230 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526233 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526235 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526238 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526240 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526242 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526245 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526248 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526252 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526254 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526257 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526259 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526261 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526265 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526268 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526270 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526273 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:49.532949 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526275 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526278 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526280 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController 
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526282 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526285 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.526288 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.526945 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.532799 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.532813 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532887 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532893 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532896 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532899 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532902 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532905 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:49.533425 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532907 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532910 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532912 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532915 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532917 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532921 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532923 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532927 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532930 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532933 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532935 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532938 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532941 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532944 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532946 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532949 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532952 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532955 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532957 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532960 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:49.533792 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532963 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532966 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532968 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532971 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532973 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532976 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532979 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532983 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532986 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532988 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532991 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532993 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532996 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.532999 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533002 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533004 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533007 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533009 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533011 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533014 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:49.534289 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533020 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533023 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533026 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533028 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533031 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533034 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533036 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533039 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533041 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533044 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533046 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533049 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533051 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533054 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533056 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533058 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533061 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533064 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533066 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533068 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:49.534773 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533071 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533075 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533078 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533081 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533084 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533086 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533089 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533091 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533094 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533096 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533098 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533100 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533103 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533106 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533109 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533111 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533114 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533117 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533119 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:49.535305 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533122 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.533126 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533216 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533219 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533222 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533225 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533228 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533230 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533233 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533235 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533238 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533240 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533243 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533245 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533248 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:49.535741 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533251 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533253 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533256 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533258 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533260 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533263 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533265 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533267 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533270 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533272 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533274 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533277 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533280 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533282 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533285 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533288 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533290 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533293 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533295 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533298 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:49.536112 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533300 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533303 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533305 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533308 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533310 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533312 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533314 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533317 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533319 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533322 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533324 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533328 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533331 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533334 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533337 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533339 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533342 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533345 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533348 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:49.536608 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533350 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533352 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533355 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533357 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533360 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533362 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533365 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533367 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533370 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533372 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533375 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533377 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533379 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533382 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533385 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533388 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533391 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533393 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533396 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:49.537072 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533398 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533401 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533403 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533406 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533408 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533410 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533413 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533415 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533417 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533420 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533422 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533424 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533427 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533429 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:49.533432 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.533437 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:49.537524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.534361 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:15:49.538015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.536490 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:15:49.538015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.537550 2579 server.go:1019] "Starting client certificate rotation"
Apr 22 14:15:49.538015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.537638 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:49.538015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.537673 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:49.564051 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.564034 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:49.568332 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.568305 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:49.582784 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.582767 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 22 14:15:49.588489 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.588470 2579 log.go:25] "Validated CRI v1 image API"
Apr 22 14:15:49.591207 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.591184 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 14:15:49.595140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.595119 2579 fs.go:135] Filesystem UUIDs: map[18a01355-9ed9-4470-8a95-f785baaab80b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d043272e-00ea-436f-8cfc-81a83cd13c0f:/dev/nvme0n1p4]
Apr 22 14:15:49.595225 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.595138 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:15:49.597295 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.597275 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:49.600528 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.600424 2579 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:49.598650115 +0000 UTC m=+0.401736062 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200523 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23bfa03752020cd55a102304eae567 SystemUUID:ec23bfa0-3752-020c-d55a-102304eae567 BootID:82180730-b718-429f-bc67-4566619eb7c3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bc:40:62:84:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bc:40:62:84:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:ac:aa:fb:c5:17 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:15:49.600528 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.600517 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:15:49.600688 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.600610 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:49.601603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.601577 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:49.601760 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.601605 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-106.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:49.601861 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.601773 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:49.601861 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.601784 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:49.601861 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.601802 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:49.603552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.603539 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:49.604400 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.604388 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:49.604514 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.604504 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:49.606788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.606777 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:49.606867 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.606799 2579 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 14:15:49.606867 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.606820 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:49.606867 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.606846 2579 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:49.606867 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.606858 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 14:15:49.607848 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.607824 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:49.607908 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.607874 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:49.610611 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.610596 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:49.611942 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.611926 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:49.613487 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613476 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613493 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613499 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613505 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613510 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613516 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613522 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613527 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613533 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:49.613540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613539 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:49.613752 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613551 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:49.613752 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.613559 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:49.614403 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.614394 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:49.614403 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.614403 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 14:15:49.615634 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.615596 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5sxpw" Apr 22 14:15:49.618825 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.618809 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:49.618901 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.618870 2579 server.go:1295] "Started kubelet" Apr 22 14:15:49.618977 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.618937 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:49.619246 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.619200 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:49.619350 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.619261 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:49.619475 ip-10-0-137-106 systemd[1]: Started Kubernetes Kubelet. Apr 22 14:15:49.620385 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.620369 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:49.620454 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.620402 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-106.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:49.620602 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.620575 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:49.620602 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.620577 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:49.621735 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.621721 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:49.624628 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.624610 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5sxpw" Apr 22 14:15:49.625393 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.625372 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 14:15:49.625393 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.625387 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:49.626042 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626026 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 14:15:49.626117 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626046 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 14:15:49.626185 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626171 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 22 
14:15:49.626185 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626182 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 22 14:15:49.626275 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.626182 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:49.626275 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626239 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 14:15:49.626363 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626303 2579 factory.go:55] Registering systemd factory Apr 22 14:15:49.626363 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626322 2579 factory.go:223] Registration of the systemd container factory successfully Apr 22 14:15:49.626552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626540 2579 factory.go:153] Registering CRI-O factory Apr 22 14:15:49.626627 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626620 2579 factory.go:223] Registration of the crio container factory successfully Apr 22 14:15:49.626745 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626726 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 14:15:49.626851 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626827 2579 factory.go:103] Registering Raw factory Apr 22 14:15:49.626935 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.626927 2579 manager.go:1196] Started watching for new ooms in manager Apr 22 14:15:49.627344 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.627321 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 14:15:49.627540 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.627524 2579 manager.go:319] Starting recovery of all containers Apr 22 14:15:49.628021 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.627032 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-106.ec2.internal.18a8b370b30512c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-106.ec2.internal,UID:ip-10-0-137-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-106.ec2.internal,},FirstTimestamp:2026-04-22 14:15:49.618823877 +0000 UTC m=+0.421909824,LastTimestamp:2026-04-22 14:15:49.618823877 +0000 UTC m=+0.421909824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-106.ec2.internal,}" Apr 22 14:15:49.633200 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.633163 2579 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 14:15:49.635486 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.635463 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:49.637709 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.637687 2579 manager.go:324] Recovery completed Apr 22 14:15:49.638243 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.638221 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-106.ec2.internal\" not found" node="ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.642612 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.642525 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:49.644778 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.644764 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:49.644857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.644790 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:49.644857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.644800 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:49.645237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.645226 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:49.645237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.645236 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:49.645297 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.645250 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:49.647688 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.647678 2579 policy_none.go:49] "None policy: Start" Apr 22 14:15:49.647725 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.647693 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:49.647725 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.647702 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:49.680529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680515 2579 manager.go:341] "Starting Device Plugin manager" Apr 22 14:15:49.680591 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.680574 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:49.680591 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680586 2579 server.go:85] "Starting device plugin registration server" Apr 22 14:15:49.680800 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680788 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:49.680858 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680803 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 14:15:49.680932 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680909 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:49.680999 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680988 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:49.680999 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680997 
Apr 22 14:15:49.680999 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.680997 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:15:49.681425 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.681405 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:15:49.681498 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.681445 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-106.ec2.internal\" not found"
Apr 22 14:15:49.756013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.755994 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:15:49.756900 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.756019 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:15:49.756900 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.756035 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:15:49.756900 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.756041 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:15:49.756900 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.756068 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
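"PLEG is not healthy: pleg has yet to be successful" above means the pod lifecycle event generator has not yet completed its first relist of containers from the runtime; the kubelet gates the main sync loop on a relist-freshness check. A minimal sketch of that predicate, assuming the upstream 3-minute freshness threshold; the types here are illustrative, not the kubelet's own.

```go
// Sketch of the freshness check behind "PLEG is not healthy": the sync
// loop is skipped until a relist has succeeded recently. The 3m threshold
// matches the upstream kubelet default.
package main

import (
	"fmt"
	"time"
)

type pleg struct {
	lastRelist time.Time // zero until the first successful relist
}

func (p *pleg) Healthy() (bool, error) {
	if p.lastRelist.IsZero() {
		return false, fmt.Errorf("pleg has yet to be successful")
	}
	if elapsed := time.Since(p.lastRelist); elapsed > 3*time.Minute {
		return false, fmt.Errorf("pleg was last seen active %v ago", elapsed)
	}
	return true, nil
}

func main() {
	p := &pleg{}
	if ok, err := p.Healthy(); !ok {
		fmt.Println("skipping pod synchronization:", err) // the state logged above
	}
	p.lastRelist = time.Now()
	ok, _ := p.Healthy()
	fmt.Println("healthy after first relist:", ok)
}
```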
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal"] Apr 22 14:15:49.856702 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.856650 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:49.858170 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.858156 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:49.858240 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.858176 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:49.858240 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.858185 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:49.859472 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.859461 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:49.859605 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.859591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.859641 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.859626 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:49.860270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860254 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:49.860331 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860280 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:49.860331 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860294 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:49.860331 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860324 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:49.860426 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860340 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:49.860426 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.860350 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:49.861901 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.861888 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.861954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.861910 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:49.862581 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.862567 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:49.862652 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.862592 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:49.862652 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.862604 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:49.892322 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.892300 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-106.ec2.internal\" not found" node="ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.896472 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.896457 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-106.ec2.internal\" not found" node="ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.903120 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:49.903107 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:49.928360 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.928342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.928405 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.928368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:49.928405 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:49.928385 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.003906 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.003888 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.029363 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.029441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.029441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.029441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.029555 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/905765c51ae51a6f36edf080ae0fc9ae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal\" (UID: \"905765c51ae51a6f36edf080ae0fc9ae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.029555 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.029485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/daadd8592f265af4bb30938041dae753-config\") pod \"kube-apiserver-proxy-ip-10-0-137-106.ec2.internal\" (UID: \"daadd8592f265af4bb30938041dae753\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.104464 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.104411 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.194030 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.194000 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.199507 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.199492 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.205197 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.205178 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.305661 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.305634 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.406208 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.406155 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.506741 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.506707 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.537227 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.537209 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:50.537618 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.537341 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:50.537618 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.537364 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:50.580171 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.580155 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:50.606795 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:50.606768 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-106.ec2.internal\" not found" Apr 22 14:15:50.626110 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.626095 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:50.627646 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.627619 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:49 +0000 UTC" deadline="2027-10-22 04:02:43.745049858 +0000 UTC" Apr 22 14:15:50.627646 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.627642 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13141h46m53.117410387s" Apr 22 14:15:50.638140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.638116 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:50.638954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.638939 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:50.658086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.658068 2579 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nb269" Apr 22 14:15:50.666369 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:50.666339 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905765c51ae51a6f36edf080ae0fc9ae.slice/crio-3a66f6c2ea15928885333d0abba164144f644c1a5af7338e7527523bb03a9b5b WatchSource:0}: Error finding container 3a66f6c2ea15928885333d0abba164144f644c1a5af7338e7527523bb03a9b5b: Status 404 returned error can't find the container with id 3a66f6c2ea15928885333d0abba164144f644c1a5af7338e7527523bb03a9b5b Apr 22 14:15:50.666759 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.666741 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nb269" Apr 22 14:15:50.671218 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:50.671196 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaadd8592f265af4bb30938041dae753.slice/crio-3529aa67d27f2f5a97f56c4031ac1943d2afe56d61afee797b4394ceba7d886b WatchSource:0}: Error finding container 3529aa67d27f2f5a97f56c4031ac1943d2afe56d61afee797b4394ceba7d886b: Status 404 returned error can't find the container with id 3529aa67d27f2f5a97f56c4031ac1943d2afe56d61afee797b4394ceba7d886b Apr 22 14:15:50.671386 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.671374 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:15:50.726549 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.726525 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.737399 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.737383 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:50.738175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.738165 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" Apr 22 14:15:50.748371 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.748355 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:50.759271 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.759228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" event={"ID":"daadd8592f265af4bb30938041dae753","Type":"ContainerStarted","Data":"3529aa67d27f2f5a97f56c4031ac1943d2afe56d61afee797b4394ceba7d886b"} Apr 22 14:15:50.760088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.760073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerStarted","Data":"3a66f6c2ea15928885333d0abba164144f644c1a5af7338e7527523bb03a9b5b"} Apr 22 14:15:51.410013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.409982 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:51.608165 ip-10-0-137-106 
Apr 22 14:15:50.759271 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.759228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" event={"ID":"daadd8592f265af4bb30938041dae753","Type":"ContainerStarted","Data":"3529aa67d27f2f5a97f56c4031ac1943d2afe56d61afee797b4394ceba7d886b"}
Apr 22 14:15:50.760088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:50.760073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerStarted","Data":"3a66f6c2ea15928885333d0abba164144f644c1a5af7338e7527523bb03a9b5b"}
Apr 22 14:15:51.410013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.409982 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:51.608165 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.608136 2579 apiserver.go:52] "Watching apiserver"
Apr 22 14:15:51.617297 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.617276 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 14:15:51.619368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.619345 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vvnzr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal","openshift-multus/multus-additional-cni-plugins-zs8tx","openshift-multus/multus-f6c4x","openshift-multus/network-metrics-daemon-f984g","openshift-network-diagnostics/network-check-target-r8r2p","kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw","openshift-cluster-node-tuning-operator/tuned-dndj5","openshift-network-operator/iptables-alerter-t8jdl","openshift-ovn-kubernetes/ovnkube-node-kbkr7","kube-system/konnectivity-agent-rc7xq","openshift-dns/node-resolver-v7x7t"]
Apr 22 14:15:51.621276 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.621251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t8jdl"
Apr 22 14:15:51.622790 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.622766 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zs8tx"
Apr 22 14:15:51.624094 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.624046 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f6c4x"
Apr 22 14:15:51.624441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.624425 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:51.624802 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.624784 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 14:15:51.624966 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.624940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:51.625351 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.625309 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7qfm6\""
Apr 22 14:15:51.625934 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.625915 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 14:15:51.626138 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.626118 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 14:15:51.626239 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.626141 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 14:15:51.626696 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.626557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:51.626696 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.626623 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.626927 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.626702 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:15:51.626985 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.626934 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5sv9s\"" Apr 22 14:15:51.627066 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.627046 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:51.627527 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.627511 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-b4c54\"" Apr 22 14:15:51.629507 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.629462 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.630790 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.630767 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:51.630880 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.630859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.630880 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.630857 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:15:51.632086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.632061 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.632700 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.632685 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:51.633388 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.633350 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:51.633388 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.633361 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:51.633539 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.633432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:51.633539 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.633352 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:51.634144 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.634126 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9qgh5\"" Apr 22 14:15:51.634227 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.634185 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-szcqw\"" Apr 22 14:15:51.634878 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.634856 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:51.634982 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.634893 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2dvwk\"" Apr 22 14:15:51.634982 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.634902 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:51.635424 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.635098 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.638008 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.637984 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.638094 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.638009 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:51.638994 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.638641 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.638994 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.638764 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639456 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-socket-dir-parent\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrstm\" (UniqueName: \"kubernetes.io/projected/173094ab-3108-4fc8-b54a-fd252e895dcb-kube-api-access-qrstm\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27vp\" (UniqueName: \"kubernetes.io/projected/ce08d471-53a2-45d6-8243-b4455eddce85-kube-api-access-v27vp\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-cnibin\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639755 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-llt9k\"" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639779 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-w7dpv\" (UniqueName: \"kubernetes.io/projected/2f37f53f-e8b0-4392-a64b-126947d423d8-kube-api-access-w7dpv\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-multus\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-kubelet\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.639950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-conf-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-system-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-os-release\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.640547 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-cni-binary-copy\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " 
pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-netns\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-hostroot\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640418 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-daemon-config\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-host-slash\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-os-release\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-k8s-cni-cncf-io\") pod \"multus-f6c4x\" (UID: 
\"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640606 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-multus-certs\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-etc-kubernetes\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640650 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640655 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640696 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-system-cni-dir\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fldkc\" (UniqueName: \"kubernetes.io/projected/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-kube-api-access-fldkc\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-cnibin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.641298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640923 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-iptables-alerter-script\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.641986 ip-10-0-137-106 
Apr 22 14:15:51.641986 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.640964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-bin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x"
Apr 22 14:15:51.642039 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.642010 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 14:15:51.642092 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.642055 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 14:15:51.642381 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.642365 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 14:15:51.642630 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.642611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 14:15:51.643672 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.643654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8vqk8\""
Apr 22 14:15:51.643776 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.643659 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 14:15:51.643776 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.643690 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sxftm\""
Apr 22 14:15:51.668207 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.668180 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:50 +0000 UTC" deadline="2027-11-19 17:27:27.048157885 +0000 UTC"
Apr 22 14:15:51.668207 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.668206 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13827h11m35.379954983s"
Apr 22 14:15:51.683325 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.683307 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:51.727913 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.727895 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:51.741351 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-sys-fs\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw"
pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.741439 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb5dd517-9720-4de6-8414-93da58e2c537-serviceca\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.741439 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gdmz\" (UniqueName: \"kubernetes.io/projected/71469e01-bd55-46b1-94cf-df4a59728e32-kube-api-access-9gdmz\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-bin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-device-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741529 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-bin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-socket-dir-parent\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.741586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qrstm\" (UniqueName: \"kubernetes.io/projected/173094ab-3108-4fc8-b54a-fd252e895dcb-kube-api-access-qrstm\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741662 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-conf\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-socket-dir-parent\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741696 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-systemd\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-node-log\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-systemd-units\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-var-lib-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-cnibin\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.741869 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " 
pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741874 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dpv\" (UniqueName: \"kubernetes.io/projected/2f37f53f-e8b0-4392-a64b-126947d423d8-kube-api-access-w7dpv\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-cnibin\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtlk\" (UniqueName: \"kubernetes.io/projected/eb5dd517-9720-4de6-8414-93da58e2c537-kube-api-access-trtlk\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-konnectivity-ca\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.741997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-system-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742068 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-modprobe-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-netns\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-bin\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-system-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-agent-certs\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05f8e253-60dd-45b7-a7a0-6117d9b61223-tmp-dir\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.742210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742225 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-slash\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5733e414-2e85-4ac0-ac62-62aace56bf89-ovn-node-metrics-cert\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-cni-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqp8\" (UniqueName: \"kubernetes.io/projected/05f8e253-60dd-45b7-a7a0-6117d9b61223-kube-api-access-ttqp8\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.742296 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-socket-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.742359 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.242338814 +0000 UTC m=+3.045424749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-tmp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69fp\" (UniqueName: \"kubernetes.io/projected/a3966bb5-f85b-4c6b-9657-66a9792e096b-kube-api-access-x69fp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-kubernetes\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742531 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-system-cni-dir\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-host\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.742866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-tuned\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742656 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-kubelet\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-system-cni-dir\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742677 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb5dd517-9720-4de6-8414-93da58e2c537-host\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldkc\" (UniqueName: \"kubernetes.io/projected/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-kube-api-access-fldkc\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-cnibin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysconfig\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-cnibin\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-lib-modules\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742882 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-log-socket\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-netd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-iptables-alerter-script\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.742964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v27vp\" (UniqueName: \"kubernetes.io/projected/ce08d471-53a2-45d6-8243-b4455eddce85-kube-api-access-v27vp\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-etc-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-script-lib\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.743568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743105 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-etc-selinux\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-multus\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-kubelet\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743178 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-conf-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-kubelet\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-var-lib-cni-multus\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-var-lib-kubelet\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-os-release\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-conf-dir\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-cni-binary-copy\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-netns\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-hostroot\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-os-release\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-ovn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-hostroot\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743772 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-env-overrides\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/05f8e253-60dd-45b7-a7a0-6117d9b61223-hosts-file\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.744377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-iptables-alerter-script\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-daemon-config\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-config\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-systemd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-multus-daemon-config\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-host-slash\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-os-release\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-k8s-cni-cncf-io\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:15:51.744745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f37f53f-e8b0-4392-a64b-126947d423d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f37f53f-e8b0-4392-a64b-126947d423d8-os-release\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-k8s-cni-cncf-io\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-multus-certs\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-etc-kubernetes\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.744977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-run\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44sn\" (UniqueName: \"kubernetes.io/projected/5733e414-2e85-4ac0-ac62-62aace56bf89-kube-api-access-c44sn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-sys\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.745131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-registration-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: 
\"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.745797 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-etc-kubernetes\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.745797 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745151 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-host-slash\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.745797 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.745206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-multus-certs\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.747081 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.743379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/173094ab-3108-4fc8-b54a-fd252e895dcb-host-run-netns\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.747673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.747652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/173094ab-3108-4fc8-b54a-fd252e895dcb-cni-binary-copy\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.749983 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.749964 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:51.750109 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.749985 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:51.750109 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.750000 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:51.750109 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:51.750052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.250036611 +0000 UTC m=+3.053122565 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:51.750255 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.750234 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:51.754033 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.754014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrstm\" (UniqueName: \"kubernetes.io/projected/173094ab-3108-4fc8-b54a-fd252e895dcb-kube-api-access-qrstm\") pod \"multus-f6c4x\" (UID: \"173094ab-3108-4fc8-b54a-fd252e895dcb\") " pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.754033 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.754027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dpv\" (UniqueName: \"kubernetes.io/projected/2f37f53f-e8b0-4392-a64b-126947d423d8-kube-api-access-w7dpv\") pod \"multus-additional-cni-plugins-zs8tx\" (UID: \"2f37f53f-e8b0-4392-a64b-126947d423d8\") " pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.754142 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.754070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fldkc\" (UniqueName: \"kubernetes.io/projected/74a0fd2f-c15e-48d5-a145-87b3aae2dab2-kube-api-access-fldkc\") pod \"iptables-alerter-t8jdl\" (UID: \"74a0fd2f-c15e-48d5-a145-87b3aae2dab2\") " pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.754142 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.754070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27vp\" (UniqueName: \"kubernetes.io/projected/ce08d471-53a2-45d6-8243-b4455eddce85-kube-api-access-v27vp\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:51.845778 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-modprobe-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-netns\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-bin\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-bin\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-netns\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-modprobe-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.845939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-agent-certs\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05f8e253-60dd-45b7-a7a0-6117d9b61223-tmp-dir\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.845993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-slash\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-slash\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5733e414-2e85-4ac0-ac62-62aace56bf89-ovn-node-metrics-cert\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqp8\" (UniqueName: \"kubernetes.io/projected/05f8e253-60dd-45b7-a7a0-6117d9b61223-kube-api-access-ttqp8\") pod \"node-resolver-v7x7t\" (UID: 
\"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.846192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-socket-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-tmp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x69fp\" (UniqueName: \"kubernetes.io/projected/a3966bb5-f85b-4c6b-9657-66a9792e096b-kube-api-access-x69fp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-kubernetes\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05f8e253-60dd-45b7-a7a0-6117d9b61223-tmp-dir\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-host\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-tuned\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-kubelet\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb5dd517-9720-4de6-8414-93da58e2c537-host\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.846462 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysconfig\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-lib-modules\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846502 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-log-socket\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-netd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-etc-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-script-lib\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysconfig\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-etc-selinux\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-var-lib-kubelet\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-kubelet\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846691 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-ovn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb5dd517-9720-4de6-8414-93da58e2c537-host\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-cni-netd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-var-lib-kubelet\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-kubernetes\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-ovn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.846948 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-etc-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-host\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846922 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-socket-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-lib-modules\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-env-overrides\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846984 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-d\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-etc-selinux\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.846992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/05f8e253-60dd-45b7-a7a0-6117d9b61223-hosts-file\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/05f8e253-60dd-45b7-a7a0-6117d9b61223-hosts-file\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-config\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-systemd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-log-socket\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-systemd\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-run\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847271 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c44sn\" (UniqueName: \"kubernetes.io/projected/5733e414-2e85-4ac0-ac62-62aace56bf89-kube-api-access-c44sn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-sys\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-registration-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.847750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-sys-fs\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb5dd517-9720-4de6-8414-93da58e2c537-serviceca\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-env-overrides\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gdmz\" (UniqueName: \"kubernetes.io/projected/71469e01-bd55-46b1-94cf-df4a59728e32-kube-api-access-9gdmz\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-run\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 
ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-script-lib\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-device-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847532 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-registration-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-conf\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5733e414-2e85-4ac0-ac62-62aace56bf89-ovnkube-config\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-systemd\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-sys-fs\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-node-log\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-node-log\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-systemd-units\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-var-lib-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.848579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-sysctl-conf\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trtlk\" (UniqueName: \"kubernetes.io/projected/eb5dd517-9720-4de6-8414-93da58e2c537-kube-api-access-trtlk\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-sys\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-run-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-var-lib-openvswitch\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5733e414-2e85-4ac0-ac62-62aace56bf89-systemd-units\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-systemd\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.847977 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-konnectivity-ca\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.848283 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb5dd517-9720-4de6-8414-93da58e2c537-serviceca\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.848471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-konnectivity-ca\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.848536 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4-agent-certs\") pod \"konnectivity-agent-rc7xq\" (UID: \"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4\") " pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.848744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-tmp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.848779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3966bb5-f85b-4c6b-9657-66a9792e096b-etc-tuned\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.849015 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/71469e01-bd55-46b1-94cf-df4a59728e32-device-dir\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.849427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.849384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5733e414-2e85-4ac0-ac62-62aace56bf89-ovn-node-metrics-cert\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.855472 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.855449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69fp\" (UniqueName: \"kubernetes.io/projected/a3966bb5-f85b-4c6b-9657-66a9792e096b-kube-api-access-x69fp\") pod \"tuned-dndj5\" (UID: \"a3966bb5-f85b-4c6b-9657-66a9792e096b\") " pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.856128 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.856095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtlk\" (UniqueName: \"kubernetes.io/projected/eb5dd517-9720-4de6-8414-93da58e2c537-kube-api-access-trtlk\") pod \"node-ca-vvnzr\" (UID: \"eb5dd517-9720-4de6-8414-93da58e2c537\") " pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.856418 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.856399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gdmz\" (UniqueName: \"kubernetes.io/projected/71469e01-bd55-46b1-94cf-df4a59728e32-kube-api-access-9gdmz\") pod \"aws-ebs-csi-driver-node-97lkw\" (UID: \"71469e01-bd55-46b1-94cf-df4a59728e32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.856500 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.856463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqp8\" (UniqueName: \"kubernetes.io/projected/05f8e253-60dd-45b7-a7a0-6117d9b61223-kube-api-access-ttqp8\") pod \"node-resolver-v7x7t\" (UID: \"05f8e253-60dd-45b7-a7a0-6117d9b61223\") " pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.856774 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.856758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44sn\" (UniqueName: \"kubernetes.io/projected/5733e414-2e85-4ac0-ac62-62aace56bf89-kube-api-access-c44sn\") pod \"ovnkube-node-kbkr7\" (UID: \"5733e414-2e85-4ac0-ac62-62aace56bf89\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.933611 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.933553 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t8jdl" Apr 22 14:15:51.942766 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.942741 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" Apr 22 14:15:51.951491 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.951473 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f6c4x" Apr 22 14:15:51.955987 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.955969 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" Apr 22 14:15:51.963509 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.963488 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dndj5" Apr 22 14:15:51.970152 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.970133 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvnzr" Apr 22 14:15:51.978089 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.978074 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:15:51.984585 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.984566 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v7x7t" Apr 22 14:15:51.989117 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:51.989099 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:15:52.250213 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.250130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:52.250213 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.250181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250306 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250330 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250344 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250328 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250410 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:53.250394812 +0000 UTC m=+4.053480747 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:52.250436 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:52.250435 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:53.25042494 +0000 UTC m=+4.053510881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:52.313189 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.313163 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b2d183_dda4_4cb2_9822_b4d4ed73f8c4.slice/crio-ab428c2f6b03edaf28ac60696e3eed6e10fb3a668a53cac83518153168e93196 WatchSource:0}: Error finding container ab428c2f6b03edaf28ac60696e3eed6e10fb3a668a53cac83518153168e93196: Status 404 returned error can't find the container with id ab428c2f6b03edaf28ac60696e3eed6e10fb3a668a53cac83518153168e93196 Apr 22 14:15:52.314224 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.314196 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173094ab_3108_4fc8_b54a_fd252e895dcb.slice/crio-60932b92a88afe28f1011cc64a88f6170ee5fd3ba338bc8b24ae3f1f372d5a37 WatchSource:0}: Error finding container 60932b92a88afe28f1011cc64a88f6170ee5fd3ba338bc8b24ae3f1f372d5a37: Status 404 returned error can't find the container with id 60932b92a88afe28f1011cc64a88f6170ee5fd3ba338bc8b24ae3f1f372d5a37 Apr 22 14:15:52.315869 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.315826 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5733e414_2e85_4ac0_ac62_62aace56bf89.slice/crio-1a83e1712ff6f656b7f05b60505a233432a4700c317bb671d85e59f837876ef6 WatchSource:0}: Error finding container 1a83e1712ff6f656b7f05b60505a233432a4700c317bb671d85e59f837876ef6: Status 404 returned error can't find the container with id 1a83e1712ff6f656b7f05b60505a233432a4700c317bb671d85e59f837876ef6 Apr 22 14:15:52.317232 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.316972 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3966bb5_f85b_4c6b_9657_66a9792e096b.slice/crio-811e8dfecbd3dfb2fa6e4f589c6572d2162233d59104dcf100891b78c631e7a9 WatchSource:0}: Error finding container 811e8dfecbd3dfb2fa6e4f589c6572d2162233d59104dcf100891b78c631e7a9: Status 404 returned error can't find the container with id 811e8dfecbd3dfb2fa6e4f589c6572d2162233d59104dcf100891b78c631e7a9 Apr 22 14:15:52.319004 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.318980 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71469e01_bd55_46b1_94cf_df4a59728e32.slice/crio-d84cf217bf99b1a5d8989c2ba1dd120c675603b697847c752499abf19f41cba8 WatchSource:0}: Error finding container d84cf217bf99b1a5d8989c2ba1dd120c675603b697847c752499abf19f41cba8: Status 404 returned error can't find the container with id d84cf217bf99b1a5d8989c2ba1dd120c675603b697847c752499abf19f41cba8 Apr 22 14:15:52.320396 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.320375 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a0fd2f_c15e_48d5_a145_87b3aae2dab2.slice/crio-ba1708fcf1e0465243a42d70412dadde542c2d27c6b3740d9a8833f5ba03def2 WatchSource:0}: Error finding container ba1708fcf1e0465243a42d70412dadde542c2d27c6b3740d9a8833f5ba03def2: Status 404 returned error can't find the container with id ba1708fcf1e0465243a42d70412dadde542c2d27c6b3740d9a8833f5ba03def2 Apr 22 14:15:52.321169 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.321147 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f8e253_60dd_45b7_a7a0_6117d9b61223.slice/crio-d8b4f0f7fe60df09f81a1b0e905bdea526cf5cc73652e408808f8ab7cac9ca3a WatchSource:0}: Error finding container d8b4f0f7fe60df09f81a1b0e905bdea526cf5cc73652e408808f8ab7cac9ca3a: Status 404 returned error can't find the container with id d8b4f0f7fe60df09f81a1b0e905bdea526cf5cc73652e408808f8ab7cac9ca3a Apr 22 14:15:52.322302 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.322278 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb5dd517_9720_4de6_8414_93da58e2c537.slice/crio-52ff222f650d2673b4ea577359af1bf178e87e2451ef7996d74216befdb2b253 WatchSource:0}: Error finding container 52ff222f650d2673b4ea577359af1bf178e87e2451ef7996d74216befdb2b253: Status 404 returned error can't find the container with id 52ff222f650d2673b4ea577359af1bf178e87e2451ef7996d74216befdb2b253 Apr 22 14:15:52.323443 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:15:52.323405 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f37f53f_e8b0_4392_a64b_126947d423d8.slice/crio-61efff1da29bd33db2a2b9825c315d2dee60f9c383a43bc2db2e05207a110377 WatchSource:0}: Error finding container 61efff1da29bd33db2a2b9825c315d2dee60f9c383a43bc2db2e05207a110377: Status 404 returned error can't find the container with id 61efff1da29bd33db2a2b9825c315d2dee60f9c383a43bc2db2e05207a110377 Apr 22 14:15:52.668963 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.668672 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:50 +0000 UTC" deadline="2027-09-17 14:47:44.360143605 +0000 UTC" Apr 22 14:15:52.668963 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.668910 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12312h31m51.691239137s" Apr 22 14:15:52.777517 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.777441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t8jdl" event={"ID":"74a0fd2f-c15e-48d5-a145-87b3aae2dab2","Type":"ContainerStarted","Data":"ba1708fcf1e0465243a42d70412dadde542c2d27c6b3740d9a8833f5ba03def2"} Apr 22 14:15:52.780351 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:15:52.780321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"1a83e1712ff6f656b7f05b60505a233432a4700c317bb671d85e59f837876ef6"} Apr 22 14:15:52.785461 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.785402 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f6c4x" event={"ID":"173094ab-3108-4fc8-b54a-fd252e895dcb","Type":"ContainerStarted","Data":"60932b92a88afe28f1011cc64a88f6170ee5fd3ba338bc8b24ae3f1f372d5a37"} Apr 22 14:15:52.787445 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.787391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" event={"ID":"71469e01-bd55-46b1-94cf-df4a59728e32","Type":"ContainerStarted","Data":"d84cf217bf99b1a5d8989c2ba1dd120c675603b697847c752499abf19f41cba8"} Apr 22 14:15:52.792488 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.792433 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dndj5" event={"ID":"a3966bb5-f85b-4c6b-9657-66a9792e096b","Type":"ContainerStarted","Data":"811e8dfecbd3dfb2fa6e4f589c6572d2162233d59104dcf100891b78c631e7a9"} Apr 22 14:15:52.797357 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.797077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rc7xq" event={"ID":"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4","Type":"ContainerStarted","Data":"ab428c2f6b03edaf28ac60696e3eed6e10fb3a668a53cac83518153168e93196"} Apr 22 14:15:52.806036 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.805941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" event={"ID":"daadd8592f265af4bb30938041dae753","Type":"ContainerStarted","Data":"bd909d563f859e5e17fc9d6f132aaba9a1fe803ff77f57650eba91edd74e43d0"} Apr 22 14:15:52.811139 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.811114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerStarted","Data":"61efff1da29bd33db2a2b9825c315d2dee60f9c383a43bc2db2e05207a110377"} Apr 22 14:15:52.812731 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.812710 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvnzr" event={"ID":"eb5dd517-9720-4de6-8414-93da58e2c537","Type":"ContainerStarted","Data":"52ff222f650d2673b4ea577359af1bf178e87e2451ef7996d74216befdb2b253"} Apr 22 14:15:52.813805 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:52.813781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v7x7t" event={"ID":"05f8e253-60dd-45b7-a7a0-6117d9b61223","Type":"ContainerStarted","Data":"d8b4f0f7fe60df09f81a1b0e905bdea526cf5cc73652e408808f8ab7cac9ca3a"} Apr 22 14:15:53.260217 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.260136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:53.260385 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.260223 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:53.260385 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260343 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:53.260385 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260360 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:53.260385 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260373 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:53.260595 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260428 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.2604096 +0000 UTC m=+6.063495548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:53.260808 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260792 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:53.260900 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.260854 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.260824639 +0000 UTC m=+6.063910577 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:53.738899 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.738793 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-106.ec2.internal" podStartSLOduration=3.738773198 podStartE2EDuration="3.738773198s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:52.82177924 +0000 UTC m=+3.624865199" watchObservedRunningTime="2026-04-22 14:15:53.738773198 +0000 UTC m=+4.541859157" Apr 22 14:15:53.739758 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.739735 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-svgjx"] Apr 22 14:15:53.741748 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.741728 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.741868 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.741798 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:15:53.758300 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.758277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:53.758403 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.758385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:15:53.758731 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.758711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:53.758860 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.758825 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:15:53.834400 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.833566 2579 generic.go:358] "Generic (PLEG): container finished" podID="905765c51ae51a6f36edf080ae0fc9ae" containerID="759f168d3e68f5c68387c4f8df0aecbd1a014fab8c78c9b3c62162cc2c7221f3" exitCode=0 Apr 22 14:15:53.834400 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.833669 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerDied","Data":"759f168d3e68f5c68387c4f8df0aecbd1a014fab8c78c9b3c62162cc2c7221f3"} Apr 22 14:15:53.865720 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.865557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-kubelet-config\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.865720 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.865598 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-dbus\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.865720 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.865630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.966048 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.966015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-kubelet-config\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.966208 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.966056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-dbus\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.966208 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.966094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.966325 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:53.966218 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:53.966325 ip-10-0-137-106 
kubenswrapper[2579]: E0422 14:15:53.966277 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.466258079 +0000 UTC m=+5.269344033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:53.966417 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.966342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-kubelet-config\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:53.967128 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:53.966491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83a28155-8693-412d-81cc-13f4d8f8cf4b-dbus\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:54.469072 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:54.468915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:54.469248 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:54.469084 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:54.469248 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:54.469143 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.469125109 +0000 UTC m=+6.272211056 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:54.838241 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:54.838144 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" event={"ID":"905765c51ae51a6f36edf080ae0fc9ae","Type":"ContainerStarted","Data":"8d1592a7f18a31c0eee1129b1570fe071bd2252fa60986c817437eaea0a5b0db"} Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.274417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.274456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274581 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274635 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.27461843 +0000 UTC m=+10.077704385 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274696 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274705 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274714 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.275220 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.274868 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.274858188 +0000 UTC m=+10.077944123 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.475821 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.475292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:55.475821 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.475451 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:55.475821 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.475508 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.475489519 +0000 UTC m=+8.278575457 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:55.759623 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.758575 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:55.759623 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.758788 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:15:55.759623 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.759329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:55.759623 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.759480 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:15:55.759921 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:55.759899 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:55.762100 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:55.760194 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:15:57.491274 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:57.491241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:57.491698 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:57.491423 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:57.491698 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:57.491483 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.491464446 +0000 UTC m=+12.294550383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:57.757333 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:57.756789 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:57.757333 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:57.756943 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:15:57.757333 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:57.757250 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:57.757579 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:57.757346 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:15:57.757579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:57.757514 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:57.757677 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:57.757656 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:15:59.306294 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:59.306239 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:59.306294 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:59.306298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:59.306828 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306426 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:59.306828 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306487 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:07.306468776 +0000 UTC m=+18.109554723 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:59.307069 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306896 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:59.307069 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306916 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:59.307069 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306927 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:59.307069 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.306977 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:07.306962249 +0000 UTC m=+18.110048184 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:59.756599 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.756720 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:59.757050 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:15:59.757080 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.757146 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:15:59.757909 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:15:59.757235 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:01.526163 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:01.526088 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:01.526577 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:01.526247 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:16:01.526577 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:01.526328 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:09.526306035 +0000 UTC m=+20.329391984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:16:01.760015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:01.759983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:01.760166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:01.760027 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:01.760166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:01.759983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:01.760166 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:01.760097 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:01.760320 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:01.760183 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:01.760320 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:01.760261 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:03.759576 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:03.759548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:03.760022 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:03.759551 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:03.760022 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:03.759660 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:03.760022 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:03.759550 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:03.760022 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:03.759736 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:03.760022 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:03.759825 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:05.756977 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:05.756946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:05.757470 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:05.756947 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:05.757470 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:05.757086 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:05.757470 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:05.757103 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:05.757470 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:05.757178 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:05.757470 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:05.757294 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:07.373075 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:07.373042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:07.373085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.373196 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.373219 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.373243 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.373256 2579 projected.go:194] Error preparing data for projected volume kube-api-access-h8bkj for pod openshift-network-diagnostics/network-check-target-r8r2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:16:07.373488 ip-10-0-137-106 
kubenswrapper[2579]: E0422 14:16:07.373261 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:23.373240922 +0000 UTC m=+34.176326861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:16:07.373488 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.373305 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj podName:5ca52a4a-534a-4287-b705-152b1ff23251 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:23.373288188 +0000 UTC m=+34.176374128 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bkj" (UniqueName: "kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj") pod "network-check-target-r8r2p" (UID: "5ca52a4a-534a-4287-b705-152b1ff23251") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:16:07.756672 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:07.756601 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:07.756826 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:07.756608 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:07.756826 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:07.756733 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:07.756826 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.756727 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:07.756981 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.756875 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:07.756981 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:07.756948 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:09.593502 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.593223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:09.594288 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:09.593384 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:16:09.594288 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:09.593597 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret podName:83a28155-8693-412d-81cc-13f4d8f8cf4b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.593583119 +0000 UTC m=+36.396669054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret") pod "global-pull-secret-syncer-svgjx" (UID: "83a28155-8693-412d-81cc-13f4d8f8cf4b") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:16:09.757463 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.757434 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:09.757611 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:09.757530 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:09.757611 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.757539 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:09.757726 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:09.757659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:09.757726 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.757693 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:09.757818 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:09.757757 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:09.860933 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.860904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" event={"ID":"71469e01-bd55-46b1-94cf-df4a59728e32","Type":"ContainerStarted","Data":"190c1510893178356556500e62449093c7d836eb2ee4f899fe7481b4d6d4aeac"} Apr 22 14:16:09.862237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.862203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dndj5" event={"ID":"a3966bb5-f85b-4c6b-9657-66a9792e096b","Type":"ContainerStarted","Data":"df2d57dbfcd72b22921e431857625c84b62e7e419847768b7059b26132522faf"} Apr 22 14:16:09.863544 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.863517 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rc7xq" event={"ID":"a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4","Type":"ContainerStarted","Data":"29938038ffc03cb4376fd3b7ddf15ffae4a2d0a4113e8dc51cc8a838c47f707a"} Apr 22 14:16:09.865215 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.865193 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="473384f7f8ab5e7d4a2c08f362050b98454456cef7efd1eb20dc23073f8a1814" exitCode=0 Apr 22 14:16:09.865338 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.865248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"473384f7f8ab5e7d4a2c08f362050b98454456cef7efd1eb20dc23073f8a1814"} Apr 22 14:16:09.866730 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.866649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvnzr" event={"ID":"eb5dd517-9720-4de6-8414-93da58e2c537","Type":"ContainerStarted","Data":"5f7b37adebba711b840a83c6151da946d120d7f5d7dc8a943573f86cbf9b74de"} Apr 22 14:16:09.868044 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.868023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v7x7t" event={"ID":"05f8e253-60dd-45b7-a7a0-6117d9b61223","Type":"ContainerStarted","Data":"a3c02e4075ae5c6ee07da498c1e711c302c7af71989fe96107f6c50f17688f82"} Apr 22 14:16:09.870703 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"5c73f2b2bbec48acee132435429dfdba7d28b0284409247031c987b68e6a138e"} Apr 22 14:16:09.870788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870708 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"59312c9cc7e744ee1ef65b7aaa228af95a985974cc9dc450484489efbbfb2e3e"} Apr 22 14:16:09.870788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870727 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"df8436e2e8d002b226ce4bea00741d350f5e3a14274b0cfcb28a980acc3c80af"} Apr 22 14:16:09.870788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870739 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"05430548524ff900f7fa7f98aef656cecd7295b8d3d12071d5c621c734079a27"} Apr 22 14:16:09.870788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"b6e1a5df2a8abca083752bbe72c59174239c32bf77d7c61eda8704436c808f05"} Apr 22 14:16:09.870788 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.870762 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"8fa7ba51b8730664cbb45a14e04edec28d8b4e49511496addd9646f364328984"} Apr 22 14:16:09.872121 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.872092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f6c4x" event={"ID":"173094ab-3108-4fc8-b54a-fd252e895dcb","Type":"ContainerStarted","Data":"c45d1f728d5ea961516dbe00262188e75f4c42ca2f2b022b4da6893adf5ad90b"} Apr 22 14:16:09.881404 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.881370 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-106.ec2.internal" podStartSLOduration=19.881361144 podStartE2EDuration="19.881361144s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:54.855092239 +0000 UTC m=+5.658178197" watchObservedRunningTime="2026-04-22 14:16:09.881361144 +0000 UTC m=+20.684447099" Apr 22 14:16:09.881818 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.881791 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dndj5" podStartSLOduration=4.414252016 podStartE2EDuration="20.881784667s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.318613748 +0000 UTC m=+3.121699684" lastFinishedPulling="2026-04-22 14:16:08.786146396 +0000 UTC m=+19.589232335" observedRunningTime="2026-04-22 14:16:09.881223844 +0000 UTC m=+20.684309800" watchObservedRunningTime="2026-04-22 14:16:09.881784667 +0000 UTC m=+20.684870626" Apr 22 14:16:09.917886 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.917848 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v7x7t" podStartSLOduration=4.456655539 podStartE2EDuration="20.91782631s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.323073621 +0000 UTC m=+3.126159561" lastFinishedPulling="2026-04-22 14:16:08.784244385 +0000 UTC m=+19.587330332" observedRunningTime="2026-04-22 14:16:09.895653527 +0000 UTC m=+20.698739487" watchObservedRunningTime="2026-04-22 14:16:09.91782631 +0000 UTC m=+20.720912265" Apr 22 14:16:09.932721 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.932682 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vvnzr" podStartSLOduration=4.472859369 podStartE2EDuration="20.932669033s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.32427906 +0000 UTC m=+3.127365011" 
lastFinishedPulling="2026-04-22 14:16:08.784088725 +0000 UTC m=+19.587174675" observedRunningTime="2026-04-22 14:16:09.932669216 +0000 UTC m=+20.735755407" watchObservedRunningTime="2026-04-22 14:16:09.932669033 +0000 UTC m=+20.735754991" Apr 22 14:16:09.952331 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.952298 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rc7xq" podStartSLOduration=4.482669912 podStartE2EDuration="20.952285867s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.314896101 +0000 UTC m=+3.117982050" lastFinishedPulling="2026-04-22 14:16:08.784512068 +0000 UTC m=+19.587598005" observedRunningTime="2026-04-22 14:16:09.951968408 +0000 UTC m=+20.755054367" watchObservedRunningTime="2026-04-22 14:16:09.952285867 +0000 UTC m=+20.755371826" Apr 22 14:16:09.969206 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:09.969165 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f6c4x" podStartSLOduration=4.471088584 podStartE2EDuration="20.96915217s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.3157814 +0000 UTC m=+3.118867336" lastFinishedPulling="2026-04-22 14:16:08.813844983 +0000 UTC m=+19.616930922" observedRunningTime="2026-04-22 14:16:09.969108361 +0000 UTC m=+20.772194318" watchObservedRunningTime="2026-04-22 14:16:09.96915217 +0000 UTC m=+20.772238128" Apr 22 14:16:10.185081 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.185057 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:16:10.694058 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.693962 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:16:10.185078556Z","UUID":"620a1cb1-e7e1-4d0c-90bd-5746d31af70c","Handler":null,"Name":"","Endpoint":""} Apr 22 14:16:10.695774 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.695752 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:16:10.695900 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.695784 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:16:10.875822 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.875779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t8jdl" event={"ID":"74a0fd2f-c15e-48d5-a145-87b3aae2dab2","Type":"ContainerStarted","Data":"7be9972a306e2c45d0f1036a214403c3f31e25cf56bd1cf752274091521596f0"} Apr 22 14:16:10.878065 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.877974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" event={"ID":"71469e01-bd55-46b1-94cf-df4a59728e32","Type":"ContainerStarted","Data":"bc382cc3a60b6feed066a2c2cd6b44c425ab2f11d8ee6295bd93a132b0ad3f3c"} Apr 22 14:16:10.893653 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:10.893595 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t8jdl" podStartSLOduration=5.431256203 
podStartE2EDuration="21.89358218s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.322142268 +0000 UTC m=+3.125228210" lastFinishedPulling="2026-04-22 14:16:08.784468239 +0000 UTC m=+19.587554187" observedRunningTime="2026-04-22 14:16:10.893356372 +0000 UTC m=+21.696442331" watchObservedRunningTime="2026-04-22 14:16:10.89358218 +0000 UTC m=+21.696668148" Apr 22 14:16:11.760297 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.760272 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:11.760746 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.760272 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:11.760746 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:11.760385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:11.760746 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.760273 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:11.760746 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:11.760477 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:11.760746 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:11.760517 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:11.803822 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.803794 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:16:11.804353 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.804336 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:16:11.886051 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.885988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"a2fa23731570d306fe00a0694eb9166464eead5c9ae80467a0e1a1234b7acbff"} Apr 22 14:16:11.888072 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.888033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" event={"ID":"71469e01-bd55-46b1-94cf-df4a59728e32","Type":"ContainerStarted","Data":"765e0d0cb93e12236ed1e8115597b2bb96ef59231779c5a2ebeaa5bb1260e1c5"} Apr 22 14:16:11.888491 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.888466 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:16:11.888941 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.888925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rc7xq" Apr 22 14:16:11.916603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:11.916566 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97lkw" podStartSLOduration=4.188687685 podStartE2EDuration="22.916553202s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.320718059 +0000 UTC m=+3.123803994" lastFinishedPulling="2026-04-22 14:16:11.048583575 +0000 UTC m=+21.851669511" observedRunningTime="2026-04-22 14:16:11.91602273 +0000 UTC m=+22.719108686" watchObservedRunningTime="2026-04-22 14:16:11.916553202 +0000 UTC m=+22.719639159" Apr 22 14:16:13.756760 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.756738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:13.757289 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.756755 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:13.757289 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.756738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:13.757289 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:13.756852 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:13.757289 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:13.756942 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:13.757289 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:13.757037 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:13.894865 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.894730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" event={"ID":"5733e414-2e85-4ac0-ac62-62aace56bf89","Type":"ContainerStarted","Data":"5304b1bb7a8f860ecba49ae9f4947f3909f677417157afbe7a5ec34e880f5218"} Apr 22 14:16:13.895075 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.895048 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:13.895179 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.895089 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:13.908990 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.908970 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:13.909881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.909866 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:13.928490 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:13.928440 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" podStartSLOduration=8.165978488 podStartE2EDuration="24.92842639s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.318131694 +0000 UTC m=+3.121217632" lastFinishedPulling="2026-04-22 14:16:09.080579599 +0000 UTC m=+19.883665534" observedRunningTime="2026-04-22 14:16:13.92803663 +0000 UTC m=+24.731122591" watchObservedRunningTime="2026-04-22 14:16:13.92842639 +0000 UTC m=+24.731512349" Apr 22 14:16:14.786102 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:14.786072 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:14.897751 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:14.897721 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="b308c1428e458f97ab059b1fcb0c16a5fd5163402e1b151b1be472e9e138bc29" exitCode=0 Apr 22 14:16:14.897882 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:14.897800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"b308c1428e458f97ab059b1fcb0c16a5fd5163402e1b151b1be472e9e138bc29"} Apr 22 14:16:15.719863 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.719797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f984g"] Apr 22 14:16:15.720058 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.719939 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:15.720058 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:15.720028 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:15.720488 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.720466 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svgjx"] Apr 22 14:16:15.720592 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.720568 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:15.720692 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:15.720671 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:15.721181 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.721159 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r8r2p"] Apr 22 14:16:15.721292 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.721244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:15.721356 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:15.721318 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:15.900770 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:15.900747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerStarted","Data":"4ca80a44fe2ec2f4e280e555a4229b7011b5bb13461936165ad30f0bb8655218"} Apr 22 14:16:16.904225 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:16.904193 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="4ca80a44fe2ec2f4e280e555a4229b7011b5bb13461936165ad30f0bb8655218" exitCode=0 Apr 22 14:16:16.904575 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:16.904249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"4ca80a44fe2ec2f4e280e555a4229b7011b5bb13461936165ad30f0bb8655218"} Apr 22 14:16:17.756533 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:17.756323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:17.756670 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:17.756323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:17.756670 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:17.756564 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:17.756751 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:17.756679 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:17.756751 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:17.756383 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:17.756821 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:17.756773 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:17.907791 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:17.907763 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="db43dbe621dd6ce47b3a5dcc6a8d74293e11061ab8ef516ce4b34776f1e8ff32" exitCode=0 Apr 22 14:16:17.908087 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:17.907804 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"db43dbe621dd6ce47b3a5dcc6a8d74293e11061ab8ef516ce4b34776f1e8ff32"} Apr 22 14:16:19.757818 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:19.757789 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:19.758415 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:19.757908 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f984g" podUID="ce08d471-53a2-45d6-8243-b4455eddce85" Apr 22 14:16:19.758415 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:19.758008 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:19.758415 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:19.758105 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:19.758415 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:19.758133 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r8r2p" podUID="5ca52a4a-534a-4287-b705-152b1ff23251" Apr 22 14:16:19.758415 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:19.758183 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-svgjx" podUID="83a28155-8693-412d-81cc-13f4d8f8cf4b" Apr 22 14:16:21.548480 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.548451 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-106.ec2.internal" event="NodeReady" Apr 22 14:16:21.549031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.548587 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:16:21.583559 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.583529 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:16:21.617434 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.617361 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kh5ld"] Apr 22 14:16:21.617564 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.617525 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.620509 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.620485 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fd7vw\"" Apr 22 14:16:21.620636 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.620578 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:16:21.620636 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.620610 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:16:21.620724 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.620710 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:16:21.626742 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.626722 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:16:21.631325 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.631303 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hjl8j"] Apr 22 14:16:21.631464 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.631449 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.634132 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.634113 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:16:21.634240 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.634131 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7gvsw\"" Apr 22 14:16:21.634240 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.634223 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:16:21.645464 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.645444 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:16:21.645548 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.645468 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kh5ld"] Apr 22 14:16:21.645548 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.645480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hjl8j"] Apr 22 14:16:21.645624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.645570 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.648229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.648209 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:16:21.648881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.648486 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9tgv\"" Apr 22 14:16:21.648881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.648486 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:16:21.648881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.648535 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:16:21.756656 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.756631 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:21.756808 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.756631 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:21.756971 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.756639 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:21.759760 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:21.759881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759802 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7xm6q\"" Apr 22 14:16:21.759881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759749 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\"" Apr 22 14:16:21.760006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759957 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:16:21.760006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759977 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:16:21.760006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.759992 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:21.788991 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.788969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.788991 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.788999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpn6g\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789132 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789132 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: 
\"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.789210 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98cf922e-f6c2-4a14-ad74-c33e7bf501df-config-volume\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.789305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.789305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd7jj\" (UniqueName: \"kubernetes.io/projected/98cf922e-f6c2-4a14-ad74-c33e7bf501df-kube-api-access-cd7jj\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.789305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789491 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.789491 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98cf922e-f6c2-4a14-ad74-c33e7bf501df-tmp-dir\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.789491 
ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.789357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdjz\" (UniqueName: \"kubernetes.io/projected/2ee495b6-b62e-473d-b4c4-23cc635944c3-kube-api-access-2bdjz\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.890062 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.889993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890062 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpn6g\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.890230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890411 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98cf922e-f6c2-4a14-ad74-c33e7bf501df-config-volume\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.890411 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890295 2579 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:21.890411 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890367 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:22.390347382 +0000 UTC m=+33.193433317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:21.890551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.890603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd7jj\" (UniqueName: \"kubernetes.io/projected/98cf922e-f6c2-4a14-ad74-c33e7bf501df-kube-api-access-cd7jj\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.890603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890687 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890687 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890610 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:21.890687 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890687 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.890897 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890712 2579 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls podName:98cf922e-f6c2-4a14-ad74-c33e7bf501df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:22.390687826 +0000 UTC m=+33.193773773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls") pod "dns-default-kh5ld" (UID: "98cf922e-f6c2-4a14-ad74-c33e7bf501df") : secret "dns-default-metrics-tls" not found Apr 22 14:16:21.890897 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98cf922e-f6c2-4a14-ad74-c33e7bf501df-tmp-dir\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.890897 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdjz\" (UniqueName: \"kubernetes.io/projected/2ee495b6-b62e-473d-b4c4-23cc635944c3-kube-api-access-2bdjz\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.890897 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.890803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98cf922e-f6c2-4a14-ad74-c33e7bf501df-config-volume\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.891056 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890901 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:21.891056 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.890917 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:21.891056 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:21.891004 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:22.390988191 +0000 UTC m=+33.194074144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:21.891173 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.891057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98cf922e-f6c2-4a14-ad74-c33e7bf501df-tmp-dir\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.891173 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.891084 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.891365 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.891332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.894944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.894921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.899340 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.899316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd7jj\" (UniqueName: \"kubernetes.io/projected/98cf922e-f6c2-4a14-ad74-c33e7bf501df-kube-api-access-cd7jj\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:21.899908 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.899885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpn6g\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.900285 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.900242 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdjz\" (UniqueName: \"kubernetes.io/projected/2ee495b6-b62e-473d-b4c4-23cc635944c3-kube-api-access-2bdjz\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:21.900766 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.900729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " 
pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:21.905126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:21.905108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:22.393882 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:22.393830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:22.393882 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:22.393884 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.393987 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.393988 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:22.394014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.394060 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls podName:98cf922e-f6c2-4a14-ad74-c33e7bf501df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:23.394042729 +0000 UTC m=+34.197128667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls") pod "dns-default-kh5ld" (UID: "98cf922e-f6c2-4a14-ad74-c33e7bf501df") : secret "dns-default-metrics-tls" not found Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.394091 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:23.394081714 +0000 UTC m=+34.197167653 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.394096 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:22.394121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.394106 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:22.394470 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:22.394157 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:23.394144244 +0000 UTC m=+34.197230179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:23.401493 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.401462 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:23.401493 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.401496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.401513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.401537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.401567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401614 2579 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401620 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401653 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401678 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs podName:ce08d471-53a2-45d6-8243-b4455eddce85 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:55.40165938 +0000 UTC m=+66.204745334 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs") pod "network-metrics-daemon-f984g" (UID: "ce08d471-53a2-45d6-8243-b4455eddce85") : secret "metrics-daemon-secret" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401681 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401694 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401695 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.401686414 +0000 UTC m=+36.204772353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401711 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls podName:98cf922e-f6c2-4a14-ad74-c33e7bf501df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.401703449 +0000 UTC m=+36.204789389 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls") pod "dns-default-kh5ld" (UID: "98cf922e-f6c2-4a14-ad74-c33e7bf501df") : secret "dns-default-metrics-tls" not found Apr 22 14:16:23.402006 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:23.401754 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:25.401744462 +0000 UTC m=+36.204830398 (durationBeforeRetry 2s). 
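[Editor's note] Every failure above has the same root cause: the Secret object named by the pod's volume spec does not exist yet, so the kubelet cannot render the volume. Four names have been reported missing so far: canary-serving-cert (openshift-ingress-canary), dns-default-metrics-tls (openshift-dns), image-registry-tls (openshift-image-registry), and metrics-daemon-secret (openshift-multus). A hedged client-go sketch that checks which of them exist; the kubeconfig path is an assumption and may differ on your host:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: an admin kubeconfig at this path; adjust as needed.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace -> secret name, exactly as reported missing in the log.
	missing := map[string]string{
		"openshift-ingress-canary": "canary-serving-cert",
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-image-registry": "image-registry-tls",
		"openshift-multus":         "metrics-daemon-secret",
	}
	for ns, name := range missing {
		_, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%s/%s exists: %v (err: %v)\n", ns, name, err == nil, err)
	}
}
```

During a node's first boot this state is usually transient: the owning operators have simply not created the serving-cert secrets yet, and the mounts clear on a later retry, as the rest of the log shows.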
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:23.403841 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.403815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bkj\" (UniqueName: \"kubernetes.io/projected/5ca52a4a-534a-4287-b705-152b1ff23251-kube-api-access-h8bkj\") pod \"network-check-target-r8r2p\" (UID: \"5ca52a4a-534a-4287-b705-152b1ff23251\") " pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:23.568660 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.568637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:23.809524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.809403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r8r2p"] Apr 22 14:16:23.812534 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:23.812512 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca52a4a_534a_4287_b705_152b1ff23251.slice/crio-bcc221022d7dede3b9118aa312d3f89b0b67772ec532ecec3b19717d03af605e WatchSource:0}: Error finding container bcc221022d7dede3b9118aa312d3f89b0b67772ec532ecec3b19717d03af605e: Status 404 returned error can't find the container with id bcc221022d7dede3b9118aa312d3f89b0b67772ec532ecec3b19717d03af605e Apr 22 14:16:23.922958 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.922860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerStarted","Data":"981c2774d09d248ce404f9aa8f0480ebeaac7a25adf4495a430d529ea09539c2"} Apr 22 14:16:23.923888 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:23.923864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r8r2p" event={"ID":"5ca52a4a-534a-4287-b705-152b1ff23251","Type":"ContainerStarted","Data":"bcc221022d7dede3b9118aa312d3f89b0b67772ec532ecec3b19717d03af605e"} Apr 22 14:16:24.928467 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:24.928430 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="981c2774d09d248ce404f9aa8f0480ebeaac7a25adf4495a430d529ea09539c2" exitCode=0 Apr 22 14:16:24.928467 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:24.928477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"981c2774d09d248ce404f9aa8f0480ebeaac7a25adf4495a430d529ea09539c2"} Apr 22 14:16:25.419502 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.419467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:25.419663 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.419510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:25.419663 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.419536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:25.419663 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419648 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:25.419813 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419676 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:25.419813 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419690 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:25.419813 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419648 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:25.419813 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419732 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:29.419711228 +0000 UTC m=+40.222797183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:25.419813 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419804 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:29.419768957 +0000 UTC m=+40.222854908 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:25.420121 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:25.419829 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls podName:98cf922e-f6c2-4a14-ad74-c33e7bf501df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:29.419817608 +0000 UTC m=+40.222903545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls") pod "dns-default-kh5ld" (UID: "98cf922e-f6c2-4a14-ad74-c33e7bf501df") : secret "dns-default-metrics-tls" not found Apr 22 14:16:25.621373 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.621305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:25.625160 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.625140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83a28155-8693-412d-81cc-13f4d8f8cf4b-original-pull-secret\") pod \"global-pull-secret-syncer-svgjx\" (UID: \"83a28155-8693-412d-81cc-13f4d8f8cf4b\") " pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:25.675891 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.675873 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgjx" Apr 22 14:16:25.933720 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.933639 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f37f53f-e8b0-4392-a64b-126947d423d8" containerID="17613ee54889e41608edcf910cda7a4d36d8f78133b9198ed329b34197510aaf" exitCode=0 Apr 22 14:16:25.934256 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:25.933717 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerDied","Data":"17613ee54889e41608edcf910cda7a4d36d8f78133b9198ed329b34197510aaf"} Apr 22 14:16:26.241367 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.241336 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t"] Apr 22 14:16:26.276252 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.276223 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t"] Apr 22 14:16:26.276394 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.276296 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" Apr 22 14:16:26.279278 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.279249 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 14:16:26.279438 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.279420 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 14:16:26.279689 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.279667 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-t4qqk\"" Apr 22 14:16:26.326673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.326513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f7jp\" (UniqueName: \"kubernetes.io/projected/97ec2540-b192-41b1-b9eb-09a5b0da42fd-kube-api-access-5f7jp\") pod \"migrator-74bb7799d9-v6j8t\" (UID: \"97ec2540-b192-41b1-b9eb-09a5b0da42fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" Apr 22 14:16:26.427495 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.427461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f7jp\" (UniqueName: \"kubernetes.io/projected/97ec2540-b192-41b1-b9eb-09a5b0da42fd-kube-api-access-5f7jp\") pod \"migrator-74bb7799d9-v6j8t\" (UID: \"97ec2540-b192-41b1-b9eb-09a5b0da42fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" Apr 22 14:16:26.444614 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.444589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f7jp\" (UniqueName: \"kubernetes.io/projected/97ec2540-b192-41b1-b9eb-09a5b0da42fd-kube-api-access-5f7jp\") pod \"migrator-74bb7799d9-v6j8t\" (UID: \"97ec2540-b192-41b1-b9eb-09a5b0da42fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" Apr 22 14:16:26.587253 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.587170 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" Apr 22 14:16:26.828706 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.828677 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svgjx"] Apr 22 14:16:26.833298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.833277 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t"] Apr 22 14:16:26.838089 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:26.838032 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a28155_8693_412d_81cc_13f4d8f8cf4b.slice/crio-47c900b88fc863f3c90102c90dab38dec9b94c6ec92f1766fd539a17cd069ca9 WatchSource:0}: Error finding container 47c900b88fc863f3c90102c90dab38dec9b94c6ec92f1766fd539a17cd069ca9: Status 404 returned error can't find the container with id 47c900b88fc863f3c90102c90dab38dec9b94c6ec92f1766fd539a17cd069ca9 Apr 22 14:16:26.838256 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:26.838237 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ec2540_b192_41b1_b9eb_09a5b0da42fd.slice/crio-8c27a283944069cfeff3e786f3bf5497b48003df69c2613e23567e5955a7750e WatchSource:0}: Error finding container 8c27a283944069cfeff3e786f3bf5497b48003df69c2613e23567e5955a7750e: Status 404 returned error can't find the container with id 8c27a283944069cfeff3e786f3bf5497b48003df69c2613e23567e5955a7750e Apr 22 14:16:26.936342 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.936310 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" event={"ID":"97ec2540-b192-41b1-b9eb-09a5b0da42fd","Type":"ContainerStarted","Data":"8c27a283944069cfeff3e786f3bf5497b48003df69c2613e23567e5955a7750e"} Apr 22 14:16:26.937171 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.937151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svgjx" event={"ID":"83a28155-8693-412d-81cc-13f4d8f8cf4b","Type":"ContainerStarted","Data":"47c900b88fc863f3c90102c90dab38dec9b94c6ec92f1766fd539a17cd069ca9"} Apr 22 14:16:26.939808 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.939789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" event={"ID":"2f37f53f-e8b0-4392-a64b-126947d423d8","Type":"ContainerStarted","Data":"49324e536cac66333a840afaf6c45e268612de60c20437cad0c646e73af175da"} Apr 22 14:16:26.966441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:26.966401 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zs8tx" podStartSLOduration=6.623108537 podStartE2EDuration="37.966389849s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.325339126 +0000 UTC m=+3.128425063" lastFinishedPulling="2026-04-22 14:16:23.668620427 +0000 UTC m=+34.471706375" observedRunningTime="2026-04-22 14:16:26.964373768 +0000 UTC m=+37.767459735" watchObservedRunningTime="2026-04-22 14:16:26.966389849 +0000 UTC m=+37.769475805" Apr 22 14:16:27.008697 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.008674 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7kbgh"] Apr 22 14:16:27.022494 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:16:27.022478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.025471 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.025456 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:16:27.025560 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.025494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:16:27.025699 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.025687 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:16:27.025794 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.025775 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4htz2\"" Apr 22 14:16:27.025863 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.025817 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:16:27.030165 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.028255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7kbgh"] Apr 22 14:16:27.032522 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.032500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.032610 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.032530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d897aa63-fe04-4bf8-a7d6-57109b418895-crio-socket\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.032610 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.032591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d897aa63-fe04-4bf8-a7d6-57109b418895-data-volume\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.032718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.032668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.032718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.032703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6j6\" (UniqueName: 
\"kubernetes.io/projected/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-api-access-qb6j6\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.132999 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.132978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6j6\" (UniqueName: \"kubernetes.io/projected/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-api-access-qb6j6\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d897aa63-fe04-4bf8-a7d6-57109b418895-crio-socket\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133228 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d897aa63-fe04-4bf8-a7d6-57109b418895-data-volume\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133228 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:27.133104 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.133228 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:27.133154 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls podName:d897aa63-fe04-4bf8-a7d6-57109b418895 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.63313733 +0000 UTC m=+38.436223265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7kbgh" (UID: "d897aa63-fe04-4bf8-a7d6-57109b418895") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.133378 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133310 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d897aa63-fe04-4bf8-a7d6-57109b418895-crio-socket\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133431 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d897aa63-fe04-4bf8-a7d6-57109b418895-data-volume\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.133629 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.133609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.141424 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.141406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6j6\" (UniqueName: \"kubernetes.io/projected/d897aa63-fe04-4bf8-a7d6-57109b418895-kube-api-access-qb6j6\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.636571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.636535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:27.636753 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:27.636707 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.636820 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:27.636782 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls podName:d897aa63-fe04-4bf8-a7d6-57109b418895 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:28.636760394 +0000 UTC m=+39.439846354 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7kbgh" (UID: "d897aa63-fe04-4bf8-a7d6-57109b418895") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.946505 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.946279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r8r2p" event={"ID":"5ca52a4a-534a-4287-b705-152b1ff23251","Type":"ContainerStarted","Data":"aaaff47953d0a25918e011693509df53d1c996842e531b70f38827ab4ce82783"} Apr 22 14:16:27.946932 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:27.946727 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:16:28.644383 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:28.644326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:28.644583 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:28.644566 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.644654 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:28.644640 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls podName:d897aa63-fe04-4bf8-a7d6-57109b418895 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.644615948 +0000 UTC m=+41.447701899 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7kbgh" (UID: "d897aa63-fe04-4bf8-a7d6-57109b418895") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.949666 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:28.949582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" event={"ID":"97ec2540-b192-41b1-b9eb-09a5b0da42fd","Type":"ContainerStarted","Data":"5a8d3f36a88ff0f0dbaa860b8ba41bed5dd2251eb40d0b917ec7fc703d7075bd"} Apr 22 14:16:28.949666 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:28.949619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" event={"ID":"97ec2540-b192-41b1-b9eb-09a5b0da42fd","Type":"ContainerStarted","Data":"55113ccde0052ea734210111b4179ba2f7a22ddfd6a40c6b78c71e1943cd0db0"} Apr 22 14:16:28.968154 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:28.968113 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r8r2p" podStartSLOduration=36.716919936 podStartE2EDuration="39.968101129s" podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:16:23.814310538 +0000 UTC m=+34.617396474" lastFinishedPulling="2026-04-22 14:16:27.065491729 +0000 UTC m=+37.868577667" observedRunningTime="2026-04-22 14:16:27.964720056 +0000 UTC m=+38.767806012" watchObservedRunningTime="2026-04-22 14:16:28.968101129 +0000 UTC m=+39.771187079" Apr 22 14:16:28.968708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:28.968681 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v6j8t" podStartSLOduration=1.307015679 podStartE2EDuration="2.9686755s" podCreationTimestamp="2026-04-22 14:16:26 +0000 UTC" firstStartedPulling="2026-04-22 14:16:26.840246457 +0000 UTC m=+37.643332393" lastFinishedPulling="2026-04-22 14:16:28.501906262 +0000 UTC m=+39.304992214" observedRunningTime="2026-04-22 14:16:28.967536468 +0000 UTC m=+39.770622438" watchObservedRunningTime="2026-04-22 14:16:28.9686755 +0000 UTC m=+39.771761457" Apr 22 14:16:29.332959 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:29.332933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v7x7t_05f8e253-60dd-45b7-a7a0-6117d9b61223/dns-node-resolver/0.log" Apr 22 14:16:29.450107 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:29.450078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:29.450275 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:29.450116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:29.450275 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:29.450147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:29.450275 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450232 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450284 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450298 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:37.450275925 +0000 UTC m=+48.253361882 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450333 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls podName:98cf922e-f6c2-4a14-ad74-c33e7bf501df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:37.450318046 +0000 UTC m=+48.253403985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls") pod "dns-default-kh5ld" (UID: "98cf922e-f6c2-4a14-ad74-c33e7bf501df") : secret "dns-default-metrics-tls" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450287 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450353 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:29.450420 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:29.450389 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:37.450379878 +0000 UTC m=+48.253465817 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:30.133089 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:30.133059 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvnzr_eb5dd517-9720-4de6-8414-93da58e2c537/node-ca/0.log" Apr 22 14:16:30.659437 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:30.659408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:30.659594 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:30.659542 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:30.659639 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:30.659616 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls podName:d897aa63-fe04-4bf8-a7d6-57109b418895 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:34.65959792 +0000 UTC m=+45.462683855 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7kbgh" (UID: "d897aa63-fe04-4bf8-a7d6-57109b418895") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:31.332850 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:31.332815 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v6j8t_97ec2540-b192-41b1-b9eb-09a5b0da42fd/migrator/0.log" Apr 22 14:16:31.534053 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:31.534018 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v6j8t_97ec2540-b192-41b1-b9eb-09a5b0da42fd/graceful-termination/0.log" Apr 22 14:16:31.957232 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:31.957201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svgjx" event={"ID":"83a28155-8693-412d-81cc-13f4d8f8cf4b","Type":"ContainerStarted","Data":"d17d67a453136dea118ae8cc4a804c38a87183e5e12960f75604a36295a050d4"} Apr 22 14:16:31.979399 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:31.979342 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-svgjx" podStartSLOduration=34.963979073 podStartE2EDuration="38.979328177s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:16:26.84008105 +0000 UTC m=+37.643166998" lastFinishedPulling="2026-04-22 14:16:30.855430166 +0000 UTC m=+41.658516102" observedRunningTime="2026-04-22 14:16:31.97923093 +0000 UTC m=+42.782316886" watchObservedRunningTime="2026-04-22 14:16:31.979328177 +0000 UTC m=+42.782414133" Apr 22 14:16:34.687352 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:34.687321 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:34.687690 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:34.687454 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:34.687690 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:34.687529 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls podName:d897aa63-fe04-4bf8-a7d6-57109b418895 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:42.687512162 +0000 UTC m=+53.490598097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7kbgh" (UID: "d897aa63-fe04-4bf8-a7d6-57109b418895") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:37.503524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.503483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:37.503524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.503525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.503551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:37.503647 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:37.503710 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert podName:2ee495b6-b62e-473d-b4c4-23cc635944c3 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:53.50369532 +0000 UTC m=+64.306781255 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert") pod "ingress-canary-hjl8j" (UID: "2ee495b6-b62e-473d-b4c4-23cc635944c3") : secret "canary-serving-cert" not found Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:37.503650 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:37.503752 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-847f5f59fb-b8ckm: secret "image-registry-tls" not found Apr 22 14:16:37.504115 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:16:37.503790 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls podName:c50a20e2-d2f8-4020-ae37-2db6e8b8b778 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:53.503779511 +0000 UTC m=+64.306865454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls") pod "image-registry-847f5f59fb-b8ckm" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778") : secret "image-registry-tls" not found Apr 22 14:16:37.506027 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.506007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98cf922e-f6c2-4a14-ad74-c33e7bf501df-metrics-tls\") pod \"dns-default-kh5ld\" (UID: \"98cf922e-f6c2-4a14-ad74-c33e7bf501df\") " pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:37.540576 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.540549 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:37.655458 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.655429 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kh5ld"] Apr 22 14:16:37.658570 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:37.658545 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cf922e_f6c2_4a14_ad74_c33e7bf501df.slice/crio-9edba61f5e9b020eabb1366d8739b404742b2c9801c5083ba63003469c3342ba WatchSource:0}: Error finding container 9edba61f5e9b020eabb1366d8739b404742b2c9801c5083ba63003469c3342ba: Status 404 returned error can't find the container with id 9edba61f5e9b020eabb1366d8739b404742b2c9801c5083ba63003469c3342ba Apr 22 14:16:37.970330 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:37.970250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh5ld" event={"ID":"98cf922e-f6c2-4a14-ad74-c33e7bf501df","Type":"ContainerStarted","Data":"9edba61f5e9b020eabb1366d8739b404742b2c9801c5083ba63003469c3342ba"} Apr 22 14:16:39.975777 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:39.975750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh5ld" event={"ID":"98cf922e-f6c2-4a14-ad74-c33e7bf501df","Type":"ContainerStarted","Data":"bad846930ac5009be374d77607b29f416cd2ffbf131058082cd4bdea80b1b0a0"} Apr 22 14:16:39.975777 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:39.975781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh5ld" event={"ID":"98cf922e-f6c2-4a14-ad74-c33e7bf501df","Type":"ContainerStarted","Data":"3fc969ad365beabc414ddc6713219fcd02076259b494abe01bf0986143f713f2"} Apr 22 14:16:39.976139 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:39.975952 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:39.993028 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:39.992986 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kh5ld" podStartSLOduration=17.605634425 podStartE2EDuration="18.992973422s" podCreationTimestamp="2026-04-22 14:16:21 +0000 UTC" firstStartedPulling="2026-04-22 14:16:37.659934281 +0000 UTC m=+48.463020216" lastFinishedPulling="2026-04-22 14:16:39.047273278 +0000 UTC m=+49.850359213" observedRunningTime="2026-04-22 14:16:39.992421021 +0000 UTC m=+50.795506977" watchObservedRunningTime="2026-04-22 14:16:39.992973422 +0000 UTC m=+50.796059404" Apr 22 14:16:42.740756 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:42.740716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:42.742954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:42.742931 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d897aa63-fe04-4bf8-a7d6-57109b418895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7kbgh\" (UID: \"d897aa63-fe04-4bf8-a7d6-57109b418895\") " pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:42.935200 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:16:42.935162 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7kbgh" Apr 22 14:16:43.047543 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:43.047510 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7kbgh"] Apr 22 14:16:43.051244 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:43.051211 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd897aa63_fe04_4bf8_a7d6_57109b418895.slice/crio-b9c7ed64cb85c07450f5c2225f0c3756d8b56ee016ce003e85eba78245d45327 WatchSource:0}: Error finding container b9c7ed64cb85c07450f5c2225f0c3756d8b56ee016ce003e85eba78245d45327: Status 404 returned error can't find the container with id b9c7ed64cb85c07450f5c2225f0c3756d8b56ee016ce003e85eba78245d45327 Apr 22 14:16:43.989226 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:43.989197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7kbgh" event={"ID":"d897aa63-fe04-4bf8-a7d6-57109b418895","Type":"ContainerStarted","Data":"a891e9bdd806ae1eef5306bcb983121dccc22493e0b88f0aa6169decd484fec9"} Apr 22 14:16:43.989226 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:43.989230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7kbgh" event={"ID":"d897aa63-fe04-4bf8-a7d6-57109b418895","Type":"ContainerStarted","Data":"2fe2a2642f41693f2ff28affac49cd8a115344dff611707c5de6c329a50524cb"} Apr 22 14:16:43.989580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:43.989238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7kbgh" event={"ID":"d897aa63-fe04-4bf8-a7d6-57109b418895","Type":"ContainerStarted","Data":"b9c7ed64cb85c07450f5c2225f0c3756d8b56ee016ce003e85eba78245d45327"} Apr 22 14:16:45.910001 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:45.909938 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbkr7" Apr 22 14:16:45.995212 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:45.995181 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7kbgh" event={"ID":"d897aa63-fe04-4bf8-a7d6-57109b418895","Type":"ContainerStarted","Data":"7fba5b85dea08a2b68b161222985229bf8a062847a8373566b796b4bbfcce1e4"} Apr 22 14:16:46.017846 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:46.017803 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7kbgh" podStartSLOduration=17.471334257 podStartE2EDuration="20.017791739s" podCreationTimestamp="2026-04-22 14:16:26 +0000 UTC" firstStartedPulling="2026-04-22 14:16:43.105584264 +0000 UTC m=+53.908670200" lastFinishedPulling="2026-04-22 14:16:45.652041748 +0000 UTC m=+56.455127682" observedRunningTime="2026-04-22 14:16:46.017221311 +0000 UTC m=+56.820307267" watchObservedRunningTime="2026-04-22 14:16:46.017791739 +0000 UTC m=+56.820877693" Apr 22 14:16:49.981274 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:49.981247 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kh5ld" Apr 22 14:16:53.515416 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.515372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:53.515416 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.515421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:53.517715 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.517679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee495b6-b62e-473d-b4c4-23cc635944c3-cert\") pod \"ingress-canary-hjl8j\" (UID: \"2ee495b6-b62e-473d-b4c4-23cc635944c3\") " pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:53.517856 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.517731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"image-registry-847f5f59fb-b8ckm\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:53.731019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.730992 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fd7vw\"" Apr 22 14:16:53.738309 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.738291 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:53.760560 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.760538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9tgv\"" Apr 22 14:16:53.765207 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.765192 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hjl8j" Apr 22 14:16:53.865386 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.865353 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:16:53.868061 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:53.868029 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50a20e2_d2f8_4020_ae37_2db6e8b8b778.slice/crio-f3244c89da2f3c093b94c61d3b0ffc027f360c3b8ac94e8b3b1c609e725ccb8a WatchSource:0}: Error finding container f3244c89da2f3c093b94c61d3b0ffc027f360c3b8ac94e8b3b1c609e725ccb8a: Status 404 returned error can't find the container with id f3244c89da2f3c093b94c61d3b0ffc027f360c3b8ac94e8b3b1c609e725ccb8a Apr 22 14:16:53.886352 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:53.886248 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hjl8j"] Apr 22 14:16:53.888613 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:53.888581 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee495b6_b62e_473d_b4c4_23cc635944c3.slice/crio-1b823078178cc2326c7668f1a3943e5715a81eaa4376eba979da6788127176da WatchSource:0}: Error finding container 1b823078178cc2326c7668f1a3943e5715a81eaa4376eba979da6788127176da: Status 404 returned error can't find the container with id 1b823078178cc2326c7668f1a3943e5715a81eaa4376eba979da6788127176da Apr 22 14:16:54.015812 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:54.015730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hjl8j" event={"ID":"2ee495b6-b62e-473d-b4c4-23cc635944c3","Type":"ContainerStarted","Data":"1b823078178cc2326c7668f1a3943e5715a81eaa4376eba979da6788127176da"} Apr 22 14:16:54.017024 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:54.017000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" event={"ID":"c50a20e2-d2f8-4020-ae37-2db6e8b8b778","Type":"ContainerStarted","Data":"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c"} Apr 22 14:16:54.017130 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:54.017028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" event={"ID":"c50a20e2-d2f8-4020-ae37-2db6e8b8b778","Type":"ContainerStarted","Data":"f3244c89da2f3c093b94c61d3b0ffc027f360c3b8ac94e8b3b1c609e725ccb8a"} Apr 22 14:16:54.017175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:54.017134 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:16:54.038994 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:54.038952 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" podStartSLOduration=71.038939822 podStartE2EDuration="1m11.038939822s" podCreationTimestamp="2026-04-22 14:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:16:54.038188072 +0000 UTC m=+64.841274028" watchObservedRunningTime="2026-04-22 14:16:54.038939822 +0000 UTC m=+64.842025778" Apr 22 14:16:55.429772 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:55.429739 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:55.432013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:55.431982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce08d471-53a2-45d6-8243-b4455eddce85-metrics-certs\") pod \"network-metrics-daemon-f984g\" (UID: \"ce08d471-53a2-45d6-8243-b4455eddce85\") " pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:55.685877 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:55.685787 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\"" Apr 22 14:16:55.693162 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:55.693147 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f984g" Apr 22 14:16:55.806213 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:55.806186 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f984g"] Apr 22 14:16:55.809168 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:16:55.809131 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce08d471_53a2_45d6_8243_b4455eddce85.slice/crio-4e77b37ac2bec686d1e49385506282b4de4a8eb37c6868a57d0794f290b55dbe WatchSource:0}: Error finding container 4e77b37ac2bec686d1e49385506282b4de4a8eb37c6868a57d0794f290b55dbe: Status 404 returned error can't find the container with id 4e77b37ac2bec686d1e49385506282b4de4a8eb37c6868a57d0794f290b55dbe Apr 22 14:16:56.022742 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:56.022711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f984g" event={"ID":"ce08d471-53a2-45d6-8243-b4455eddce85","Type":"ContainerStarted","Data":"4e77b37ac2bec686d1e49385506282b4de4a8eb37c6868a57d0794f290b55dbe"} Apr 22 14:16:56.023889 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:56.023866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hjl8j" event={"ID":"2ee495b6-b62e-473d-b4c4-23cc635944c3","Type":"ContainerStarted","Data":"c08851ba8d918f6d830a149aac7f2b177070e9fc981c10294f48636efc0f00f7"} Apr 22 14:16:58.030936 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:58.030901 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f984g" event={"ID":"ce08d471-53a2-45d6-8243-b4455eddce85","Type":"ContainerStarted","Data":"abd80b9f7f31cc6b955b478b8385a834a2ab83128ceef1126806f5e5b3111b3c"} Apr 22 14:16:58.030936 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:58.030941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f984g" event={"ID":"ce08d471-53a2-45d6-8243-b4455eddce85","Type":"ContainerStarted","Data":"51063b505dd73562ff8815cbec85dd06f27cb9e4726408dc1e841b6646abc643"} Apr 22 14:16:58.049387 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:58.049339 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f984g" podStartSLOduration=67.782714682 podStartE2EDuration="1m9.049324928s" 
podCreationTimestamp="2026-04-22 14:15:49 +0000 UTC" firstStartedPulling="2026-04-22 14:16:55.81100829 +0000 UTC m=+66.614094225" lastFinishedPulling="2026-04-22 14:16:57.077618532 +0000 UTC m=+67.880704471" observedRunningTime="2026-04-22 14:16:58.047633368 +0000 UTC m=+68.850719324" watchObservedRunningTime="2026-04-22 14:16:58.049324928 +0000 UTC m=+68.852410884" Apr 22 14:16:58.049830 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:58.049799 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hjl8j" podStartSLOduration=35.627251379 podStartE2EDuration="37.049791563s" podCreationTimestamp="2026-04-22 14:16:21 +0000 UTC" firstStartedPulling="2026-04-22 14:16:53.891629824 +0000 UTC m=+64.694715759" lastFinishedPulling="2026-04-22 14:16:55.314169994 +0000 UTC m=+66.117255943" observedRunningTime="2026-04-22 14:16:56.0423019 +0000 UTC m=+66.845387858" watchObservedRunningTime="2026-04-22 14:16:58.049791563 +0000 UTC m=+68.852877519" Apr 22 14:16:59.955220 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:16:59.955194 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r8r2p" Apr 22 14:17:03.343651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.343621 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"] Apr 22 14:17:03.357789 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.357759 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"] Apr 22 14:17:03.357925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.357821 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.361166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.361145 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:17:03.361298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.361173 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:17:03.361298 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.361206 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 14:17:03.362495 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.362474 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 14:17:03.362619 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.362529 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bhrwc\"" Apr 22 14:17:03.362619 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.362518 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:17:03.362619 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.362491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:17:03.362619 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.362488 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:17:03.367092 ip-10-0-137-106 kubenswrapper[2579]: I0422 
14:17:03.367072 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 14:17:03.478903 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.478874 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.478903 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.478902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.479086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.478920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6nqk\" (UniqueName: \"kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.479086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.478939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.479086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.478972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.479086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.479038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.479086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.479061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580263 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " 
pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6nqk\" (UniqueName: \"kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580938 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.580991 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.580976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.581066 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.581049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.590770 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.590746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.592635 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.592607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.592712 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.592633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6nqk\" (UniqueName: \"kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.603079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.603033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config\") pod \"console-6c7cd749cb-rtz5l\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.667654 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.667632 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:03.779138 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:03.776973 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"] Apr 22 14:17:03.780314 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:03.780288 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3744c521_e4ef_4e9f_99f2_19f20e4a3356.slice/crio-b4eafd9b1bfa552063b32a6805ec37dc692bb1781b01e70d86ed354a4369267f WatchSource:0}: Error finding container b4eafd9b1bfa552063b32a6805ec37dc692bb1781b01e70d86ed354a4369267f: Status 404 returned error can't find the container with id b4eafd9b1bfa552063b32a6805ec37dc692bb1781b01e70d86ed354a4369267f Apr 22 14:17:04.048268 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:04.048232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7cd749cb-rtz5l" event={"ID":"3744c521-e4ef-4e9f-99f2-19f20e4a3356","Type":"ContainerStarted","Data":"b4eafd9b1bfa552063b32a6805ec37dc692bb1781b01e70d86ed354a4369267f"} Apr 22 14:17:07.057322 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:07.057286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7cd749cb-rtz5l" event={"ID":"3744c521-e4ef-4e9f-99f2-19f20e4a3356","Type":"ContainerStarted","Data":"4f2e1a4668e7405c0d930d2b1774ea6bd1a3f78079a8dfad84b9a75c91884721"} Apr 22 14:17:07.077142 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:07.077099 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c7cd749cb-rtz5l" podStartSLOduration=1.33561166 podStartE2EDuration="4.077084965s" podCreationTimestamp="2026-04-22 14:17:03 +0000 UTC" firstStartedPulling="2026-04-22 14:17:03.795885829 +0000 UTC m=+74.598971764" lastFinishedPulling="2026-04-22 14:17:06.537359126 +0000 UTC m=+77.340445069" observedRunningTime="2026-04-22 14:17:07.075818803 +0000 UTC m=+77.878904784" watchObservedRunningTime="2026-04-22 14:17:07.077084965 +0000 UTC m=+77.880170921" Apr 22 14:17:13.668694 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:13.668657 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:13.669133 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:13.668708 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:13.673208 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:13.673187 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:13.828877 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:13.828827 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:17:13.832583 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:13.832562 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:17:14.079852 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:14.079818 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:17:16.371611 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.371577 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g"] Apr 22 14:17:16.374443 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.374428 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:16.377358 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.377339 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 14:17:16.377436 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.377410 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-c284v\"" Apr 22 14:17:16.383178 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.383159 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g"] Apr 22 14:17:16.469237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.469206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0cf5c90d-03ee-47c0-9928-ef11da808610-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jk48g\" (UID: \"0cf5c90d-03ee-47c0-9928-ef11da808610\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:16.570257 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.570234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0cf5c90d-03ee-47c0-9928-ef11da808610-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jk48g\" (UID: \"0cf5c90d-03ee-47c0-9928-ef11da808610\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:16.573144 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.573118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0cf5c90d-03ee-47c0-9928-ef11da808610-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jk48g\" (UID: \"0cf5c90d-03ee-47c0-9928-ef11da808610\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:16.683405 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.683337 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:16.800417 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:16.800384 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g"] Apr 22 14:17:16.803074 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:16.803047 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf5c90d_03ee_47c0_9928_ef11da808610.slice/crio-a572440dae44ad4c9ad7e3e8d26e5d5ff237d41516135c9bfc156d50a2740a03 WatchSource:0}: Error finding container a572440dae44ad4c9ad7e3e8d26e5d5ff237d41516135c9bfc156d50a2740a03: Status 404 returned error can't find the container with id a572440dae44ad4c9ad7e3e8d26e5d5ff237d41516135c9bfc156d50a2740a03 Apr 22 14:17:17.083814 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:17.083783 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" event={"ID":"0cf5c90d-03ee-47c0-9928-ef11da808610","Type":"ContainerStarted","Data":"a572440dae44ad4c9ad7e3e8d26e5d5ff237d41516135c9bfc156d50a2740a03"} Apr 22 14:17:19.089703 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:19.089672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" event={"ID":"0cf5c90d-03ee-47c0-9928-ef11da808610","Type":"ContainerStarted","Data":"e3fdd5f291e88081a2b1da1544d9409c15de5f2f09fd942428272715f998f01d"} Apr 22 14:17:19.090082 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:19.089907 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:19.094660 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:19.094637 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" Apr 22 14:17:19.111952 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:19.111893 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jk48g" podStartSLOduration=1.868699934 podStartE2EDuration="3.111879738s" podCreationTimestamp="2026-04-22 14:17:16 +0000 UTC" firstStartedPulling="2026-04-22 14:17:16.804769518 +0000 UTC m=+87.607855453" lastFinishedPulling="2026-04-22 14:17:18.047949316 +0000 UTC m=+88.851035257" observedRunningTime="2026-04-22 14:17:19.110591149 +0000 UTC m=+89.913677129" watchObservedRunningTime="2026-04-22 14:17:19.111879738 +0000 UTC m=+89.914965696" Apr 22 14:17:25.024294 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.024260 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd"] Apr 22 14:17:25.026749 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.026734 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.030944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.030917 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 14:17:25.031152 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.031134 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:17:25.032323 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.032304 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 14:17:25.032323 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.032311 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:17:25.032461 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.032364 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rb25q\"" Apr 22 14:17:25.032889 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.032638 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:17:25.044859 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.044826 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hg6mm"] Apr 22 14:17:25.046888 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.046875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.053813 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.053794 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd"] Apr 22 14:17:25.054082 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.054063 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:17:25.054356 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.054340 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:17:25.054433 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.054344 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:17:25.054484 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.054437 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s69rl\"" Apr 22 14:17:25.123665 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123669 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-accelerators-collector-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps742\" (UniqueName: \"kubernetes.io/projected/51a5a4df-f45c-48f8-b0bb-8605d0d627be-kube-api-access-ps742\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-wtmp\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-sys\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-textfile\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.123811 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-tls\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-root\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-metrics-client-ca\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/59a394f8-6982-4782-9e92-b0377ea4a655-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl4s4\" (UniqueName: \"kubernetes.io/projected/59a394f8-6982-4782-9e92-b0377ea4a655-kube-api-access-vl4s4\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.124068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.123983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.224510 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59a394f8-6982-4782-9e92-b0377ea4a655-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.224620 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl4s4\" (UniqueName: \"kubernetes.io/projected/59a394f8-6982-4782-9e92-b0377ea4a655-kube-api-access-vl4s4\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.224620 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.224620 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.224620 ip-10-0-137-106 kubenswrapper[2579]: I0422 
14:17:25.224591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.224799 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:17:25.224668 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 14:17:25.224799 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-accelerators-collector-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.224799 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:17:25.224735 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls podName:59a394f8-6982-4782-9e92-b0377ea4a655 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.724713626 +0000 UTC m=+96.527799577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8bwkd" (UID: "59a394f8-6982-4782-9e92-b0377ea4a655") : secret "openshift-state-metrics-tls" not found Apr 22 14:17:25.224799 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps742\" (UniqueName: \"kubernetes.io/projected/51a5a4df-f45c-48f8-b0bb-8605d0d627be-kube-api-access-ps742\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-wtmp\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-sys\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-textfile\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224906 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-tls\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-root\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-metrics-client-ca\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225019 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.224983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-sys\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225334 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-wtmp\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225334 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-textfile\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225334 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59a394f8-6982-4782-9e92-b0377ea4a655-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.225334 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/51a5a4df-f45c-48f8-b0bb-8605d0d627be-root\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.225484 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-accelerators-collector-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 
22 14:17:25.225626 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.225606 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a5a4df-f45c-48f8-b0bb-8605d0d627be-metrics-client-ca\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.226944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.226924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.227039 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.227023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.227139 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.227120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a5a4df-f45c-48f8-b0bb-8605d0d627be-node-exporter-tls\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.239142 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.239115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps742\" (UniqueName: \"kubernetes.io/projected/51a5a4df-f45c-48f8-b0bb-8605d0d627be-kube-api-access-ps742\") pod \"node-exporter-hg6mm\" (UID: \"51a5a4df-f45c-48f8-b0bb-8605d0d627be\") " pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.241968 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.241945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl4s4\" (UniqueName: \"kubernetes.io/projected/59a394f8-6982-4782-9e92-b0377ea4a655-kube-api-access-vl4s4\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.358230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.358157 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hg6mm" Apr 22 14:17:25.367983 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:25.367953 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a5a4df_f45c_48f8_b0bb_8605d0d627be.slice/crio-b5eb8f9c5c202b46ff46a103bc37d464a91fd8e27c7e6572387f70d820b9f831 WatchSource:0}: Error finding container b5eb8f9c5c202b46ff46a103bc37d464a91fd8e27c7e6572387f70d820b9f831: Status 404 returned error can't find the container with id b5eb8f9c5c202b46ff46a103bc37d464a91fd8e27c7e6572387f70d820b9f831 Apr 22 14:17:25.729370 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:25.729287 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:25.729508 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:17:25.729428 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 14:17:25.729508 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:17:25.729490 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls podName:59a394f8-6982-4782-9e92-b0377ea4a655 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:26.729474913 +0000 UTC m=+97.532560849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8bwkd" (UID: "59a394f8-6982-4782-9e92-b0377ea4a655") : secret "openshift-state-metrics-tls" not found Apr 22 14:17:26.106910 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.106868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hg6mm" event={"ID":"51a5a4df-f45c-48f8-b0bb-8605d0d627be","Type":"ContainerStarted","Data":"b5eb8f9c5c202b46ff46a103bc37d464a91fd8e27c7e6572387f70d820b9f831"} Apr 22 14:17:26.737273 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.737244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:26.739454 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.739434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59a394f8-6982-4782-9e92-b0377ea4a655-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8bwkd\" (UID: \"59a394f8-6982-4782-9e92-b0377ea4a655\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" Apr 22 14:17:26.835281 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.835254 2579 util.go:30] "No sandbox for pod can be found. 
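
The two failed mounts above show the kubelet's per-volume retry backoff: the first MountVolume.SetUp failure for openshift-state-metrics-tls schedules a retry after 500ms (durationBeforeRetry 500ms), the next failure doubles that to 1s, and the mount finally succeeds at 14:17:26.739 once the secret exists, most likely because the pod spec referencing the secret reached the kubelet before the monitoring operator had created the secret itself. A minimal sketch of that doubling backoff, in Go since that is the kubelet's language; this illustrates the pattern visible in the log, is not kubelet source, and every name in it is made up:

package main

import (
	"errors"
	"fmt"
	"time"
)

// errNotFound stands in for `secret "openshift-state-metrics-tls" not found`.
var errNotFound = errors.New("secret not found")

// mountSecretVolume pretends the secret only appears on the third attempt,
// mirroring the log above: two failures, then success.
func mountSecretVolume(attempt int) error {
	if attempt < 3 {
		return errNotFound
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond // initial durationBeforeRetry in the log
	for attempt := 1; ; attempt++ {
		err := mountSecretVolume(attempt)
		if err == nil {
			fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
			return
		}
		fmt.Printf("attempt %d failed (%v); no retries permitted for %v\n", attempt, err, backoff)
		time.Sleep(backoff)
		backoff *= 2 // 500ms -> 1s, matching the two errors above
	}
}

As the log also shows, the pod's other volumes proceed independently in the meantime; only the failing volume re-enters the retry loop.
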
Apr 22 14:17:26.927934 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.927907 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5499b445bb-tln7b"]
Apr 22 14:17:26.931031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.931012 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:26.935578 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.935556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 14:17:26.935742 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.935724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 14:17:26.935851 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.935804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-c64qb\""
Apr 22 14:17:26.936590 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.936568 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 14:17:26.936694 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.936613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 14:17:26.936694 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.936658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-16be41eaniuns\""
Apr 22 14:17:26.936787 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.936736 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 14:17:26.944790 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.944770 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5499b445bb-tln7b"]
Apr 22 14:17:26.957702 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:26.957684 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd"]
Apr 22 14:17:26.960529 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:26.960501 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a394f8_6982_4782_9e92_b0377ea4a655.slice/crio-366c0aac23dbb735cf0602caf5dc93bec8c920218e017aebc074fdd985c4452f WatchSource:0}: Error finding container 366c0aac23dbb735cf0602caf5dc93bec8c920218e017aebc074fdd985c4452f: Status 404 returned error can't find the container with id 366c0aac23dbb735cf0602caf5dc93bec8c920218e017aebc074fdd985c4452f
Apr 22 14:17:27.039115 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd7284-1f97-413e-9b26-7705617a2be5-metrics-client-ca\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039223 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-grpc-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039223 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039327 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039327 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039394 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5cz\" (UniqueName: \"kubernetes.io/projected/3dcd7284-1f97-413e-9b26-7705617a2be5-kube-api-access-8v5cz\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039394 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039347 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.039394 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.039374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.110254 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.110219 2579 generic.go:358] "Generic (PLEG): container finished" podID="51a5a4df-f45c-48f8-b0bb-8605d0d627be" containerID="0229cd21680ab73643f0f33b3952a3d8101d0fdefdb5b6a5ff4f34e8d1b3d0a3" exitCode=0
Apr 22 14:17:27.110630 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.110307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hg6mm" event={"ID":"51a5a4df-f45c-48f8-b0bb-8605d0d627be","Type":"ContainerDied","Data":"0229cd21680ab73643f0f33b3952a3d8101d0fdefdb5b6a5ff4f34e8d1b3d0a3"}
Apr 22 14:17:27.111708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.111688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" event={"ID":"59a394f8-6982-4782-9e92-b0377ea4a655","Type":"ContainerStarted","Data":"00fef7b755d9c554ec3956719f6b9bf7d888c95999b2e1f22ca30156247b612f"}
Apr 22 14:17:27.111708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.111715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" event={"ID":"59a394f8-6982-4782-9e92-b0377ea4a655","Type":"ContainerStarted","Data":"a736218adee1f0a7adeafa71b88a1de639bbb172713361ec3aa602ea16d06a21"}
Apr 22 14:17:27.111825 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.111729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" event={"ID":"59a394f8-6982-4782-9e92-b0377ea4a655","Type":"ContainerStarted","Data":"366c0aac23dbb735cf0602caf5dc93bec8c920218e017aebc074fdd985c4452f"}
Apr 22 14:17:27.140439 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd7284-1f97-413e-9b26-7705617a2be5-metrics-client-ca\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-grpc-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140669 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140669 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140669 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5cz\" (UniqueName: \"kubernetes.io/projected/3dcd7284-1f97-413e-9b26-7705617a2be5-kube-api-access-8v5cz\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.140669 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.140641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.141228 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.141203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dcd7284-1f97-413e-9b26-7705617a2be5-metrics-client-ca\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143022 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.142990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143112 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.143051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-grpc-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143332 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.143312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143409 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.143338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143461 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.143420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.143594 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.143465 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3dcd7284-1f97-413e-9b26-7705617a2be5-secret-thanos-querier-tls\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.153571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.153549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5cz\" (UniqueName: \"kubernetes.io/projected/3dcd7284-1f97-413e-9b26-7705617a2be5-kube-api-access-8v5cz\") pod \"thanos-querier-5499b445bb-tln7b\" (UID: \"3dcd7284-1f97-413e-9b26-7705617a2be5\") " pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.240733 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.240708 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:27.365584 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:27.365547 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5499b445bb-tln7b"]
Apr 22 14:17:27.369177 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:27.369144 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd7284_1f97_413e_9b26_7705617a2be5.slice/crio-3ed23f9027ab134b438b5ca3a9fecfa703d317f4bf6b22f3105744e224b53524 WatchSource:0}: Error finding container 3ed23f9027ab134b438b5ca3a9fecfa703d317f4bf6b22f3105744e224b53524: Status 404 returned error can't find the container with id 3ed23f9027ab134b438b5ca3a9fecfa703d317f4bf6b22f3105744e224b53524
Apr 22 14:17:28.116928 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:28.116874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"3ed23f9027ab134b438b5ca3a9fecfa703d317f4bf6b22f3105744e224b53524"}
Apr 22 14:17:28.119079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:28.119038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hg6mm" event={"ID":"51a5a4df-f45c-48f8-b0bb-8605d0d627be","Type":"ContainerStarted","Data":"ae75a2d894872ba5a9f2e5746d994d6a485f6e006b98cbba2d64debec52aa0f8"}
Apr 22 14:17:28.119079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:28.119075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hg6mm" event={"ID":"51a5a4df-f45c-48f8-b0bb-8605d0d627be","Type":"ContainerStarted","Data":"a7b3369a1e517b6985abef8cbd8818998fd5cfd8112fe3ce90470355f40c137b"}
event={"ID":"51a5a4df-f45c-48f8-b0bb-8605d0d627be","Type":"ContainerStarted","Data":"a7b3369a1e517b6985abef8cbd8818998fd5cfd8112fe3ce90470355f40c137b"} Apr 22 14:17:28.143954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:28.143893 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hg6mm" podStartSLOduration=2.349954343 podStartE2EDuration="3.143878597s" podCreationTimestamp="2026-04-22 14:17:25 +0000 UTC" firstStartedPulling="2026-04-22 14:17:25.369442744 +0000 UTC m=+96.172528679" lastFinishedPulling="2026-04-22 14:17:26.163366982 +0000 UTC m=+96.966452933" observedRunningTime="2026-04-22 14:17:28.141788391 +0000 UTC m=+98.944874344" watchObservedRunningTime="2026-04-22 14:17:28.143878597 +0000 UTC m=+98.946964557" Apr 22 14:17:29.124267 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.124215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" event={"ID":"59a394f8-6982-4782-9e92-b0377ea4a655","Type":"ContainerStarted","Data":"6c229e0ab38181c068251e13932a902fc9c90cb862136f4ea3cc2afc5035d45f"} Apr 22 14:17:29.147485 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.147416 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8bwkd" podStartSLOduration=4.058227564 podStartE2EDuration="5.147403695s" podCreationTimestamp="2026-04-22 14:17:24 +0000 UTC" firstStartedPulling="2026-04-22 14:17:27.067691112 +0000 UTC m=+97.870777047" lastFinishedPulling="2026-04-22 14:17:28.156867225 +0000 UTC m=+98.959953178" observedRunningTime="2026-04-22 14:17:29.145820346 +0000 UTC m=+99.948906321" watchObservedRunningTime="2026-04-22 14:17:29.147403695 +0000 UTC m=+99.950489652" Apr 22 14:17:29.746153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.746124 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"] Apr 22 14:17:29.748529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.748513 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 14:17:29.753873 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.753848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 14:17:29.754910 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.754892 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5vs9z\""
Apr 22 14:17:29.772982 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.772960 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"]
Apr 22 14:17:29.865545 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.865516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2043c022-20b9-4167-9f2b-b4197849da3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rr2n6\" (UID: \"2043c022-20b9-4167-9f2b-b4197849da3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"
Apr 22 14:17:29.966228 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.966188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2043c022-20b9-4167-9f2b-b4197849da3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rr2n6\" (UID: \"2043c022-20b9-4167-9f2b-b4197849da3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"
Apr 22 14:17:29.968418 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:29.968393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2043c022-20b9-4167-9f2b-b4197849da3b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rr2n6\" (UID: \"2043c022-20b9-4167-9f2b-b4197849da3b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"
Apr 22 14:17:30.057026 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.056960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"
Apr 22 14:17:30.130186 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.130118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"39eb2a07b644afa896d7c363efa61a28a5477a9a865c17939e5044748d22fa3e"}
Apr 22 14:17:30.130186 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.130161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"8ca7c60e4657c7609359fa7258309f422eb844ce4179d8ca336a542e82b0b696"}
Apr 22 14:17:30.130186 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.130170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"90a133159a5e6635777d721fdb1fd60674ca361b9b8c174601148d0e7672d6dc"}
Apr 22 14:17:30.191857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.191812 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-867d7b45c-dq2gk"]
Apr 22 14:17:30.194533 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.194510 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.198661 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.198518 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6"]
Apr 22 14:17:30.199577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 14:17:30.199577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199316 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 14:17:30.199577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199371 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 14:17:30.199577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199384 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 14:17:30.199577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199459 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-cvj7t\""
Apr 22 14:17:30.199883 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.199788 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 14:17:30.204729 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.204709 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 14:17:30.209896 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.209859 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-867d7b45c-dq2gk"]
Apr 22 14:17:30.269234 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
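
The burst of reflector.go:430 "Caches populated" lines before each pod's volume processing appears to be the kubelet warming a dedicated watch per referenced object: for telemeter-client it lists exactly the secrets and configmaps the pod mounts (telemeter-client-tls, federate-client-certs, the kube-rbac-proxy config, the trusted CA bundle, and so on), and only after these caches are populated do the VerifyControllerAttachedVolume entries for the same objects appear. That reading of the mechanism is an inference from the ordering here, not something the log states outright.
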
Apr 22 14:17:30.269368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269262 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-trusted-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-metrics-client-ca\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269494 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-serving-certs-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269494 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269467 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269583 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fcx\" (UniqueName: \"kubernetes.io/projected/8db7bb95-61b6-4001-9976-00de970e2b49-kube-api-access-h7fcx\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.269709 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.269627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-federate-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.303343 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:30.303313 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2043c022_20b9_4167_9f2b_b4197849da3b.slice/crio-d5861192c8b02098e0f8c0744943ca41647f123b9f1b20b3368178036a706d54 WatchSource:0}: Error finding container d5861192c8b02098e0f8c0744943ca41647f123b9f1b20b3368178036a706d54: Status 404 returned error can't find the container with id d5861192c8b02098e0f8c0744943ca41647f123b9f1b20b3368178036a706d54
Apr 22 14:17:30.370078 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-federate-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-trusted-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-metrics-client-ca\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-serving-certs-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.370368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.370253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fcx\" (UniqueName: \"kubernetes.io/projected/8db7bb95-61b6-4001-9976-00de970e2b49-kube-api-access-h7fcx\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.371288 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.371168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-serving-certs-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.371288 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.371262 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-metrics-client-ca\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.371288 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.371284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-trusted-ca-bundle\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.372667 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.372630 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-federate-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.372988 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.372970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.373171 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.373152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-secret-telemeter-client\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.373257 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.373238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8db7bb95-61b6-4001-9976-00de970e2b49-telemeter-client-tls\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.380221 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.380191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fcx\" (UniqueName: \"kubernetes.io/projected/8db7bb95-61b6-4001-9976-00de970e2b49-kube-api-access-h7fcx\") pod \"telemeter-client-867d7b45c-dq2gk\" (UID: \"8db7bb95-61b6-4001-9976-00de970e2b49\") " pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.504712 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.504688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk"
Apr 22 14:17:30.624222 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:30.624192 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-867d7b45c-dq2gk"]
Apr 22 14:17:30.626961 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:30.626936 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db7bb95_61b6_4001_9976_00de970e2b49.slice/crio-e70958f3542c12bc3158661a950cfabd5ef713ceee9c4d3e815d99d4806541d6 WatchSource:0}: Error finding container e70958f3542c12bc3158661a950cfabd5ef713ceee9c4d3e815d99d4806541d6: Status 404 returned error can't find the container with id e70958f3542c12bc3158661a950cfabd5ef713ceee9c4d3e815d99d4806541d6
Apr 22 14:17:31.136395 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.136360 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"bcfccbfe7feec8d5610af169e4035ba3b7073182ecc2cdf43a7a8774e507b1d4"}
Apr 22 14:17:31.136942 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.136400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"809436e23e8b3eb3505f7a9385883234b6181d5a460b3a0689ce7da5bcf0cb30"}
Apr 22 14:17:31.136942 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.136414 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" event={"ID":"3dcd7284-1f97-413e-9b26-7705617a2be5","Type":"ContainerStarted","Data":"9cb6e9b8c184752582d3a11d2312f34850e7b78fc54c45fccc4a6b215e9e74a0"}
Apr 22 14:17:31.136942 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.136529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b"
Apr 22 14:17:31.137599 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.137576 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk" event={"ID":"8db7bb95-61b6-4001-9976-00de970e2b49","Type":"ContainerStarted","Data":"e70958f3542c12bc3158661a950cfabd5ef713ceee9c4d3e815d99d4806541d6"}
Apr 22 14:17:31.138625 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.138592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6" event={"ID":"2043c022-20b9-4167-9f2b-b4197849da3b","Type":"ContainerStarted","Data":"d5861192c8b02098e0f8c0744943ca41647f123b9f1b20b3368178036a706d54"}
event={"ID":"2043c022-20b9-4167-9f2b-b4197849da3b","Type":"ContainerStarted","Data":"d5861192c8b02098e0f8c0744943ca41647f123b9f1b20b3368178036a706d54"} Apr 22 14:17:31.169143 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.169091 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" podStartSLOduration=2.2081047209999998 podStartE2EDuration="5.169075616s" podCreationTimestamp="2026-04-22 14:17:26 +0000 UTC" firstStartedPulling="2026-04-22 14:17:27.371074799 +0000 UTC m=+98.174160741" lastFinishedPulling="2026-04-22 14:17:30.332045701 +0000 UTC m=+101.135131636" observedRunningTime="2026-04-22 14:17:31.166771229 +0000 UTC m=+101.969857185" watchObservedRunningTime="2026-04-22 14:17:31.169075616 +0000 UTC m=+101.972161573" Apr 22 14:17:31.186954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.186930 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:17:31.189510 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.189496 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.193805 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.193787 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 14:17:31.194122 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 14:17:31.194213 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194106 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 14:17:31.194213 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194162 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 14:17:31.194386 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194368 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 14:17:31.194492 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194428 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 14:17:31.194552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194502 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 14:17:31.194617 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.194607 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 14:17:31.195440 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.195409 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 14:17:31.195575 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.195559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vn8r2\"" Apr 22 14:17:31.195631 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.195594 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 14:17:31.195631 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.195605 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 14:17:31.195930 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.195915 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-aeugfc2hf3dd4\"" Apr 22 14:17:31.201932 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.201908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 14:17:31.202862 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.202823 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 14:17:31.224716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.224678 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:17:31.276806 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.276777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.276925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.276819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.276925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.276872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.276925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.276904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.276969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqpbs\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-kube-api-access-vqpbs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config-out\") 
pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277633 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.277698 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.277638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378185 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 
14:17:31.378286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config-out\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.378651 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.379216 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.379216 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.378689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqpbs\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-kube-api-access-vqpbs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.379315 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.379237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.381642 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.381617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.381767 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.381671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.382269 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.381956 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.382579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.382551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.384746 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.384365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.384746 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.384560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.384746 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.384633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.384975 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.384819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-config-out\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.384975 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.384963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.385478 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.385421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.385693 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.385673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.386586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.386471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.387236 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.387178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.387410 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.387391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.387883 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.387861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.388602 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.388580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqpbs\" (UniqueName: \"kubernetes.io/projected/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-kube-api-access-vqpbs\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.389961 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.389913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d0fedaf-c65e-4389-b9e9-e4e607a788d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d0fedaf-c65e-4389-b9e9-e4e607a788d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.499042 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.499012 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:31.793473 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:31.793443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:17:31.794721 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:17:31.794683 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d0fedaf_c65e_4389_b9e9_e4e607a788d7.slice/crio-cf18af643693468e4f353e6a76080f22bc46b1148821e102a398fc5860857519 WatchSource:0}: Error finding container cf18af643693468e4f353e6a76080f22bc46b1148821e102a398fc5860857519: Status 404 returned error can't find the container with id cf18af643693468e4f353e6a76080f22bc46b1148821e102a398fc5860857519 Apr 22 14:17:32.144681 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:32.144639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6" event={"ID":"2043c022-20b9-4167-9f2b-b4197849da3b","Type":"ContainerStarted","Data":"fed079c051d67b0217b585800ebbeee4fef0097d5b0dd4671bb3890350dd701a"} Apr 22 14:17:32.145162 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:32.144869 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6" Apr 22 14:17:32.146174 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:32.146147 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"cf18af643693468e4f353e6a76080f22bc46b1148821e102a398fc5860857519"} Apr 22 14:17:32.150424 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:32.150404 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6" Apr 22 14:17:32.163163 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:32.163122 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rr2n6" podStartSLOduration=1.789090895 podStartE2EDuration="3.163110653s" podCreationTimestamp="2026-04-22 14:17:29 +0000 UTC" firstStartedPulling="2026-04-22 14:17:30.305089117 +0000 UTC m=+101.108175052" lastFinishedPulling="2026-04-22 14:17:31.679108872 +0000 UTC m=+102.482194810" observedRunningTime="2026-04-22 14:17:32.161885863 +0000 UTC m=+102.964971824" watchObservedRunningTime="2026-04-22 14:17:32.163110653 +0000 UTC m=+102.966196609" Apr 22 14:17:33.151039 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.150963 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk" event={"ID":"8db7bb95-61b6-4001-9976-00de970e2b49","Type":"ContainerStarted","Data":"886acf48f5db38a1b3bea2eb5f699f5789b166a15604f2656eb936795dd8395a"} Apr 22 14:17:33.151039 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.150999 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk" event={"ID":"8db7bb95-61b6-4001-9976-00de970e2b49","Type":"ContainerStarted","Data":"c9fa07ea43e87b13d947c17ed79dec605630ca0d3d65be64ab8ac2c3316d742e"} Apr 22 14:17:33.151039 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.151008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk" 
event={"ID":"8db7bb95-61b6-4001-9976-00de970e2b49","Type":"ContainerStarted","Data":"f446b825238494ee87aebaebef0c14fb04d67fbcd4111b1c20f2413a052b8085"} Apr 22 14:17:33.152357 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.152334 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d0fedaf-c65e-4389-b9e9-e4e607a788d7" containerID="56cefacfe81a2a4591a7781bd5a8620fba25db8a2117fc84e396cd21db9d42d8" exitCode=0 Apr 22 14:17:33.152478 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.152422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerDied","Data":"56cefacfe81a2a4591a7781bd5a8620fba25db8a2117fc84e396cd21db9d42d8"} Apr 22 14:17:33.177898 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:33.177853 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-867d7b45c-dq2gk" podStartSLOduration=0.947099961 podStartE2EDuration="3.177819523s" podCreationTimestamp="2026-04-22 14:17:30 +0000 UTC" firstStartedPulling="2026-04-22 14:17:30.628710865 +0000 UTC m=+101.431796802" lastFinishedPulling="2026-04-22 14:17:32.859430428 +0000 UTC m=+103.662516364" observedRunningTime="2026-04-22 14:17:33.175756469 +0000 UTC m=+103.978842425" watchObservedRunningTime="2026-04-22 14:17:33.177819523 +0000 UTC m=+103.980905479" Apr 22 14:17:36.163658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"2d091ad5f10541bcca2d7b208b7993d46eb399ca3ab0a44a98d359e098d2b8e3"} Apr 22 14:17:36.163658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"87a32587ee2b47c0a9b8623ce4f1414b0fe493dab0db4ac177f01d2124b95feb"} Apr 22 14:17:36.164111 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"028705c9dce66b66d5ee96e483177566284a0818c48004c62f893f3694bb7f64"} Apr 22 14:17:36.164111 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"8ddb0e7457324e72b33a128d5984d7c4bdb0e222f0a275b0c96d8bf4849210dd"} Apr 22 14:17:36.164111 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163701 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"67ee069f2176341ee78ea9b2b2cb37dbba3b5426fe698787fcb309c9a193455f"} Apr 22 14:17:36.164111 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.163713 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d0fedaf-c65e-4389-b9e9-e4e607a788d7","Type":"ContainerStarted","Data":"d846162848063d2dc721b68be8d4cf8d7199e66bab9afcfc0c31d70580e06bc1"} Apr 22 14:17:36.203389 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.203336 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.298285641 podStartE2EDuration="5.203324007s" podCreationTimestamp="2026-04-22 14:17:31 +0000 UTC" firstStartedPulling="2026-04-22 14:17:31.796716984 +0000 UTC m=+102.599802922" lastFinishedPulling="2026-04-22 14:17:35.701755349 +0000 UTC m=+106.504841288" observedRunningTime="2026-04-22 14:17:36.202549753 +0000 UTC m=+107.005635709" watchObservedRunningTime="2026-04-22 14:17:36.203324007 +0000 UTC m=+107.006410008" Apr 22 14:17:36.499971 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:36.499927 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:17:37.153718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:37.153696 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5499b445bb-tln7b" Apr 22 14:17:38.846785 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:38.846740 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" podUID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" containerName="registry" containerID="cri-o://739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c" gracePeriod=30 Apr 22 14:17:39.082473 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.082451 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:17:39.171977 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.171904 2579 generic.go:358] "Generic (PLEG): container finished" podID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" containerID="739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c" exitCode=0 Apr 22 14:17:39.172116 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.171978 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" Apr 22 14:17:39.172116 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.171994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" event={"ID":"c50a20e2-d2f8-4020-ae37-2db6e8b8b778","Type":"ContainerDied","Data":"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c"} Apr 22 14:17:39.172116 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.172037 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847f5f59fb-b8ckm" event={"ID":"c50a20e2-d2f8-4020-ae37-2db6e8b8b778","Type":"ContainerDied","Data":"f3244c89da2f3c093b94c61d3b0ffc027f360c3b8ac94e8b3b1c609e725ccb8a"} Apr 22 14:17:39.172116 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.172068 2579 scope.go:117] "RemoveContainer" containerID="739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c" Apr 22 14:17:39.179407 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.179392 2579 scope.go:117] "RemoveContainer" containerID="739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c" Apr 22 14:17:39.179684 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:17:39.179662 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c\": container with ID starting with 739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c not found: ID does not exist" containerID="739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c" Apr 22 14:17:39.179780 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.179687 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c"} err="failed to get container status \"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c\": rpc error: code = NotFound desc = could not find container \"739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c\": container with ID starting with 739b9ee3d8c4d9aaf232a48704f85e2087f7ede8a124568cb1d02b518c8a330c not found: ID does not exist" Apr 22 14:17:39.249360 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249334 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249473 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249414 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249473 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249460 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249572 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249496 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249572 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249525 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249572 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249550 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249734 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249579 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpn6g\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.249734 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249617 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") pod \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\" (UID: \"c50a20e2-d2f8-4020-ae37-2db6e8b8b778\") " Apr 22 14:17:39.250029 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.249935 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:17:39.250140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.250050 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:17:39.251871 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.251816 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:17:39.252014 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.251974 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:17:39.252083 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.252021 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:17:39.252135 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.252093 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:17:39.252135 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.252101 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g" (OuterVolumeSpecName: "kube-api-access-fpn6g") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "kube-api-access-fpn6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:17:39.257929 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.257902 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c50a20e2-d2f8-4020-ae37-2db6e8b8b778" (UID: "c50a20e2-d2f8-4020-ae37-2db6e8b8b778"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:17:39.351125 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351096 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-bound-sa-token\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351125 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351123 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-certificates\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351139 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-installation-pull-secrets\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351154 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-image-registry-private-configuration\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351167 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-trusted-ca\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351179 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-ca-trust-extracted\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351191 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpn6g\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-kube-api-access-fpn6g\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.351277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.351203 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50a20e2-d2f8-4020-ae37-2db6e8b8b778-registry-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:17:39.494718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.494686 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:17:39.499490 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.499467 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-847f5f59fb-b8ckm"] Apr 22 14:17:39.760512 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:39.760481 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" path="/var/lib/kubelet/pods/c50a20e2-d2f8-4020-ae37-2db6e8b8b778/volumes" Apr 22 14:17:45.071765 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:17:45.071730 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"] Apr 22 14:18:10.089403 ip-10-0-137-106 kubenswrapper[2579]: 
I0422 14:18:10.089346 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c7cd749cb-rtz5l" podUID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" containerName="console" containerID="cri-o://4f2e1a4668e7405c0d930d2b1774ea6bd1a3f78079a8dfad84b9a75c91884721" gracePeriod=15 Apr 22 14:18:10.280229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.280054 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7cd749cb-rtz5l_3744c521-e4ef-4e9f-99f2-19f20e4a3356/console/0.log" Apr 22 14:18:10.280229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.280106 2579 generic.go:358] "Generic (PLEG): container finished" podID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" containerID="4f2e1a4668e7405c0d930d2b1774ea6bd1a3f78079a8dfad84b9a75c91884721" exitCode=2 Apr 22 14:18:10.280229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.280175 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7cd749cb-rtz5l" event={"ID":"3744c521-e4ef-4e9f-99f2-19f20e4a3356","Type":"ContainerDied","Data":"4f2e1a4668e7405c0d930d2b1774ea6bd1a3f78079a8dfad84b9a75c91884721"} Apr 22 14:18:10.350466 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.350440 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7cd749cb-rtz5l_3744c521-e4ef-4e9f-99f2-19f20e4a3356/console/0.log" Apr 22 14:18:10.350577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.350495 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7cd749cb-rtz5l" Apr 22 14:18:10.488505 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488471 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488518 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488543 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488588 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6nqk\" (UniqueName: \"kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488607 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488658 
ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488640 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488689 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert\") pod \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\" (UID: \"3744c521-e4ef-4e9f-99f2-19f20e4a3356\") " Apr 22 14:18:10.488925 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488823 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config" (OuterVolumeSpecName: "console-config") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:10.489031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488964 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-config\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.489031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.488980 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:10.489112 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.489029 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:10.489410 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.489378 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca" (OuterVolumeSpecName: "service-ca") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:10.490922 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.490896 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:10.490922 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.490914 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:10.491045 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.490916 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk" (OuterVolumeSpecName: "kube-api-access-l6nqk") pod "3744c521-e4ef-4e9f-99f2-19f20e4a3356" (UID: "3744c521-e4ef-4e9f-99f2-19f20e4a3356"). InnerVolumeSpecName "kube-api-access-l6nqk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589604 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-service-ca\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589634 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-serving-cert\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589645 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3744c521-e4ef-4e9f-99f2-19f20e4a3356-console-oauth-config\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589654 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-oauth-serving-cert\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589663 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l6nqk\" (UniqueName: \"kubernetes.io/projected/3744c521-e4ef-4e9f-99f2-19f20e4a3356-kube-api-access-l6nqk\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:10.589673 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:10.589672 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3744c521-e4ef-4e9f-99f2-19f20e4a3356-trusted-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:18:11.283709 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.283685 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7cd749cb-rtz5l_3744c521-e4ef-4e9f-99f2-19f20e4a3356/console/0.log" Apr 22 14:18:11.284126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.283774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7cd749cb-rtz5l" event={"ID":"3744c521-e4ef-4e9f-99f2-19f20e4a3356","Type":"ContainerDied","Data":"b4eafd9b1bfa552063b32a6805ec37dc692bb1781b01e70d86ed354a4369267f"} Apr 22 14:18:11.284126 ip-10-0-137-106 kubenswrapper[2579]: 
Apr 22 14:18:11.284126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.283812 2579 scope.go:117] "RemoveContainer" containerID="4f2e1a4668e7405c0d930d2b1774ea6bd1a3f78079a8dfad84b9a75c91884721"
Apr 22 14:18:11.315310 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.315281 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"]
Apr 22 14:18:11.327304 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.327281 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c7cd749cb-rtz5l"]
Apr 22 14:18:11.764559 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:11.764522 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" path="/var/lib/kubelet/pods/3744c521-e4ef-4e9f-99f2-19f20e4a3356/volumes"
Apr 22 14:18:31.499499 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:31.499462 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:18:31.517556 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:31.517533 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:18:32.355541 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:18:32.355515 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 14:20:49.651581 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:20:49.651552 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 14:22:53.811321 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811293 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"]
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811586 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" containerName="registry"
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811597 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" containerName="registry"
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811610 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" containerName="console"
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811615 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" containerName="console"
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811663 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c50a20e2-d2f8-4020-ae37-2db6e8b8b778" containerName="registry"
Apr 22 14:22:53.811713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.811673 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3744c521-e4ef-4e9f-99f2-19f20e4a3356" containerName="console"
Apr 22 14:22:53.814675 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.814653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:53.817742 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.817720 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 14:22:53.818090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.818075 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 14:22:53.819455 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.819436 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-lm5bq\""
Apr 22 14:22:53.819552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.819489 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 14:22:53.829421 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.829400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"]
Apr 22 14:22:53.841249 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.841230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:53.841345 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.841268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbkv\" (UniqueName: \"kubernetes.io/projected/610dd975-29e1-440e-98a8-b8ad71b73274-kube-api-access-xcbkv\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:53.942559 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.942531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:53.942730 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.942589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbkv\" (UniqueName: \"kubernetes.io/projected/610dd975-29e1-440e-98a8-b8ad71b73274-kube-api-access-xcbkv\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:53.942794 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:22:53.942754 2579 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 14:22:53.942851 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:22:53.942816 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert podName:610dd975-29e1-440e-98a8-b8ad71b73274 nodeName:}" failed. No retries permitted until 2026-04-22 14:22:54.442796765 +0000 UTC m=+425.245882705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert") pod "llmisvc-controller-manager-68cc5db7c4-9t9rl" (UID: "610dd975-29e1-440e-98a8-b8ad71b73274") : secret "llmisvc-webhook-server-cert" not found
Apr 22 14:22:53.955800 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:53.955771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbkv\" (UniqueName: \"kubernetes.io/projected/610dd975-29e1-440e-98a8-b8ad71b73274-kube-api-access-xcbkv\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:54.447808 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:54.447775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:54.450084 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:54.450057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610dd975-29e1-440e-98a8-b8ad71b73274-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9t9rl\" (UID: \"610dd975-29e1-440e-98a8-b8ad71b73274\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:54.724452 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:54.724382 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:22:54.847075 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:54.847051 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"]
Apr 22 14:22:54.848979 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:22:54.848953 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod610dd975_29e1_440e_98a8_b8ad71b73274.slice/crio-389104293dcde6125675d03a60fdf8fc6078b1823463d84a3615c6115784f34d WatchSource:0}: Error finding container 389104293dcde6125675d03a60fdf8fc6078b1823463d84a3615c6115784f34d: Status 404 returned error can't find the container with id 389104293dcde6125675d03a60fdf8fc6078b1823463d84a3615c6115784f34d
Apr 22 14:22:54.850222 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:54.850204 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:22:55.034972 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:55.034945 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl" event={"ID":"610dd975-29e1-440e-98a8-b8ad71b73274","Type":"ContainerStarted","Data":"389104293dcde6125675d03a60fdf8fc6078b1823463d84a3615c6115784f34d"}
Apr 22 14:22:57.042303 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:57.042210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl" event={"ID":"610dd975-29e1-440e-98a8-b8ad71b73274","Type":"ContainerStarted","Data":"9ddf267d1fa691046632e76a78670883b30a31f6081ee36f510fcc42af36b62c"}
Apr 22 14:22:57.042671 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:22:57.042310 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:23:28.047261 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:23:28.047229 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl"
Apr 22 14:23:28.067892 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:23:28.067817 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9t9rl" podStartSLOduration=33.151434873 podStartE2EDuration="35.067802575s" podCreationTimestamp="2026-04-22 14:22:53 +0000 UTC" firstStartedPulling="2026-04-22 14:22:54.850319414 +0000 UTC m=+425.653405349" lastFinishedPulling="2026-04-22 14:22:56.766687105 +0000 UTC m=+427.569773051" observedRunningTime="2026-04-22 14:22:57.071674077 +0000 UTC m=+427.874760033" watchObservedRunningTime="2026-04-22 14:23:28.067802575 +0000 UTC m=+458.870888535"
Apr 22 14:24:19.061111 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.061077 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-rq7bs"]
Apr 22 14:24:19.064427 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.064411 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rq7bs"
Apr 22 14:24:19.067536 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.067516 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vmlfk\""
Apr 22 14:24:19.067657 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.067516 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 14:24:19.071456 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.071434 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rq7bs"]
Apr 22 14:24:19.165094 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.165062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4grm\" (UniqueName: \"kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm\") pod \"s3-init-rq7bs\" (UID: \"1ae6367c-74d7-4ec3-8923-c5d1947551d8\") " pod="kserve/s3-init-rq7bs"
Apr 22 14:24:19.265536 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.265507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4grm\" (UniqueName: \"kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm\") pod \"s3-init-rq7bs\" (UID: \"1ae6367c-74d7-4ec3-8923-c5d1947551d8\") " pod="kserve/s3-init-rq7bs"
Apr 22 14:24:19.275217 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.275196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4grm\" (UniqueName: \"kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm\") pod \"s3-init-rq7bs\" (UID: \"1ae6367c-74d7-4ec3-8923-c5d1947551d8\") " pod="kserve/s3-init-rq7bs"
Apr 22 14:24:19.385013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.384944 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rq7bs"
Apr 22 14:24:19.499775 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:19.499743 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rq7bs"]
Apr 22 14:24:19.502909 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:24:19.502881 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ae6367c_74d7_4ec3_8923_c5d1947551d8.slice/crio-974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82 WatchSource:0}: Error finding container 974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82: Status 404 returned error can't find the container with id 974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82
Apr 22 14:24:20.267665 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:20.267619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rq7bs" event={"ID":"1ae6367c-74d7-4ec3-8923-c5d1947551d8","Type":"ContainerStarted","Data":"974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82"}
Apr 22 14:24:24.281419 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:24.281382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rq7bs" event={"ID":"1ae6367c-74d7-4ec3-8923-c5d1947551d8","Type":"ContainerStarted","Data":"a8701765d8382bf621821fdb77b137b1bdaf3cfb51cf075c65264ec365dcfad9"}
Apr 22 14:24:24.303099 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:24.303054 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-rq7bs" podStartSLOduration=0.96040006 podStartE2EDuration="5.303041177s" podCreationTimestamp="2026-04-22 14:24:19 +0000 UTC" firstStartedPulling="2026-04-22 14:24:19.504757078 +0000 UTC m=+510.307843016" lastFinishedPulling="2026-04-22 14:24:23.847398194 +0000 UTC m=+514.650484133" observedRunningTime="2026-04-22 14:24:24.300884073 +0000 UTC m=+515.103970029" watchObservedRunningTime="2026-04-22 14:24:24.303041177 +0000 UTC m=+515.106127131"
Apr 22 14:24:27.292158 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:27.292126 2579 generic.go:358] "Generic (PLEG): container finished" podID="1ae6367c-74d7-4ec3-8923-c5d1947551d8" containerID="a8701765d8382bf621821fdb77b137b1bdaf3cfb51cf075c65264ec365dcfad9" exitCode=0
Apr 22 14:24:27.292498 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:27.292187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rq7bs" event={"ID":"1ae6367c-74d7-4ec3-8923-c5d1947551d8","Type":"ContainerDied","Data":"a8701765d8382bf621821fdb77b137b1bdaf3cfb51cf075c65264ec365dcfad9"}
Apr 22 14:24:28.412095 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:28.412073 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-rq7bs" Apr 22 14:24:28.437974 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:28.437948 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4grm\" (UniqueName: \"kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm\") pod \"1ae6367c-74d7-4ec3-8923-c5d1947551d8\" (UID: \"1ae6367c-74d7-4ec3-8923-c5d1947551d8\") " Apr 22 14:24:28.440056 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:28.440026 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm" (OuterVolumeSpecName: "kube-api-access-v4grm") pod "1ae6367c-74d7-4ec3-8923-c5d1947551d8" (UID: "1ae6367c-74d7-4ec3-8923-c5d1947551d8"). InnerVolumeSpecName "kube-api-access-v4grm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:28.539367 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:28.539338 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v4grm\" (UniqueName: \"kubernetes.io/projected/1ae6367c-74d7-4ec3-8923-c5d1947551d8-kube-api-access-v4grm\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:24:29.298350 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:29.298312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rq7bs" event={"ID":"1ae6367c-74d7-4ec3-8923-c5d1947551d8","Type":"ContainerDied","Data":"974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82"} Apr 22 14:24:29.298350 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:29.298337 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rq7bs" Apr 22 14:24:29.298601 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:29.298345 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974b5be63ef170e1418160f721fc54dbdd3921ee6a1634990855793e62affa82" Apr 22 14:24:39.476417 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.476344 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:24:39.476756 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.476637 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ae6367c-74d7-4ec3-8923-c5d1947551d8" containerName="s3-init" Apr 22 14:24:39.476756 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.476647 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae6367c-74d7-4ec3-8923-c5d1947551d8" containerName="s3-init" Apr 22 14:24:39.476756 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.476700 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ae6367c-74d7-4ec3-8923-c5d1947551d8" containerName="s3-init" Apr 22 14:24:39.479490 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.479475 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:24:39.482428 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.482408 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-n4vmn\"" Apr 22 14:24:39.488367 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.488347 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:24:39.489657 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.489635 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:24:39.584936 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.584908 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:24:39.588584 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.588553 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:24:39.598767 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.598743 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:24:39.615148 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.615041 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:24:39.617715 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:24:39.617692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6569b4b3_4678_49b0_86c5_21d7a839c64f.slice/crio-2843b4673d8588d57c5a54d498eed16cae9803c12a9fcfbe12b981101fca83c4 WatchSource:0}: Error finding container 2843b4673d8588d57c5a54d498eed16cae9803c12a9fcfbe12b981101fca83c4: Status 404 returned error can't find the container with id 2843b4673d8588d57c5a54d498eed16cae9803c12a9fcfbe12b981101fca83c4 Apr 22 14:24:39.719480 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.719456 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-s8pc8\" (UID: \"458e541b-5615-4eef-813b-df55c095f0cf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:24:39.820539 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.820511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-s8pc8\" (UID: \"458e541b-5615-4eef-813b-df55c095f0cf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:24:39.820852 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.820814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-s8pc8\" (UID: \"458e541b-5615-4eef-813b-df55c095f0cf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:24:39.898938 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:39.898911 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:24:40.019797 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:40.019775 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:24:40.021508 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:24:40.021480 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e WatchSource:0}: Error finding container 6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e: Status 404 returned error can't find the container with id 6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e Apr 22 14:24:40.330784 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:40.330745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" event={"ID":"6569b4b3-4678-49b0-86c5-21d7a839c64f","Type":"ContainerStarted","Data":"2843b4673d8588d57c5a54d498eed16cae9803c12a9fcfbe12b981101fca83c4"} Apr 22 14:24:40.332751 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:40.332722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerStarted","Data":"6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e"} Apr 22 14:24:53.379511 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:53.379478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" event={"ID":"6569b4b3-4678-49b0-86c5-21d7a839c64f","Type":"ContainerStarted","Data":"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a"} Apr 22 14:24:53.379931 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:53.379679 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:24:53.380999 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:53.380974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerStarted","Data":"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09"} Apr 22 14:24:53.381209 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:53.381171 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:24:53.396882 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:53.396847 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podStartSLOduration=0.727953578 podStartE2EDuration="14.3968226s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:39.620508015 +0000 UTC m=+530.423593950" lastFinishedPulling="2026-04-22 14:24:53.289377034 +0000 UTC m=+544.092462972" observedRunningTime="2026-04-22 14:24:53.39604747 +0000 UTC m=+544.199133447" watchObservedRunningTime="2026-04-22 14:24:53.3968226 
+0000 UTC m=+544.199908589" Apr 22 14:24:54.383617 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:54.383579 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:24:57.392189 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:57.392158 2579 generic.go:358] "Generic (PLEG): container finished" podID="458e541b-5615-4eef-813b-df55c095f0cf" containerID="2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09" exitCode=0 Apr 22 14:24:57.392519 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:24:57.392233 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerDied","Data":"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09"} Apr 22 14:25:04.383890 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:04.383829 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:25:14.383929 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:14.383878 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:25:19.458783 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:19.458748 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerStarted","Data":"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68"} Apr 22 14:25:19.459175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:19.459041 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:25:19.460196 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:19.460172 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:25:19.478481 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:19.478443 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podStartSLOduration=1.538090631 podStartE2EDuration="40.47843209s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:40.02342938 +0000 UTC m=+530.826515316" lastFinishedPulling="2026-04-22 14:25:18.963770837 +0000 UTC m=+569.766856775" observedRunningTime="2026-04-22 14:25:19.477102402 +0000 UTC m=+570.280188381" watchObservedRunningTime="2026-04-22 14:25:19.47843209 +0000 UTC m=+570.281518050" Apr 22 14:25:20.462312 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:20.462275 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:25:24.384551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:24.384512 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:25:30.462434 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:30.462391 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:25:34.384229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:34.384186 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 22 14:25:40.462659 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:40.462610 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:25:44.385007 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:44.384974 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:25:50.463296 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:50.463250 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:25:59.287177 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.287146 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:25:59.304894 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.304868 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:25:59.305051 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.304966 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.307777 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.307756 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 14:25:59.307777 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.307776 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1cfdf-kube-rbac-proxy-sar-config\"" Apr 22 14:25:59.307976 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.307761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1cfdf-serving-cert\"" Apr 22 14:25:59.466322 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.466289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.466470 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.466332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.567125 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.567043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.567125 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.567099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.567745 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.567722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.569389 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.569348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls\") pod \"switch-graph-1cfdf-5887c7bb65-mq9nr\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.614863 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.614825 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:25:59.732686 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:25:59.732658 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:25:59.735673 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:25:59.735643 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733ef929_3e43_46f7_80b0_1b8a0d2bcd13.slice/crio-7694e2d43a5b1865cd4c942bc067bd0f00f0442409ff4fc2e88b2084547a8315 WatchSource:0}: Error finding container 7694e2d43a5b1865cd4c942bc067bd0f00f0442409ff4fc2e88b2084547a8315: Status 404 returned error can't find the container with id 7694e2d43a5b1865cd4c942bc067bd0f00f0442409ff4fc2e88b2084547a8315 Apr 22 14:26:00.463343 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:00.463296 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:26:00.574150 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:00.574114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" event={"ID":"733ef929-3e43-46f7-80b0-1b8a0d2bcd13","Type":"ContainerStarted","Data":"7694e2d43a5b1865cd4c942bc067bd0f00f0442409ff4fc2e88b2084547a8315"} Apr 22 14:26:02.581610 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:02.581577 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" event={"ID":"733ef929-3e43-46f7-80b0-1b8a0d2bcd13","Type":"ContainerStarted","Data":"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4"} Apr 22 14:26:02.581996 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:02.581634 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:26:02.600714 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:02.600671 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podStartSLOduration=1.324403454 podStartE2EDuration="3.600656903s" podCreationTimestamp="2026-04-22 14:25:59 +0000 UTC" firstStartedPulling="2026-04-22 14:25:59.737384915 +0000 UTC m=+610.540470851" lastFinishedPulling="2026-04-22 14:26:02.013638348 +0000 UTC m=+612.816724300" observedRunningTime="2026-04-22 14:26:02.599592159 +0000 UTC m=+613.402678174" watchObservedRunningTime="2026-04-22 14:26:02.600656903 +0000 UTC m=+613.403742910" Apr 22 14:26:08.590823 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:08.590753 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:26:09.456292 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.456262 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:26:09.456464 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.456445 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" 
containerID="cri-o://07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4" gracePeriod=30 Apr 22 14:26:09.618140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.618113 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:26:09.618559 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.618389 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" containerID="cri-o://6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a" gracePeriod=30 Apr 22 14:26:09.752365 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.752340 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:26:09.755517 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.755501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:26:09.764654 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.764637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:26:09.769566 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.769527 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:26:09.897324 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:09.897299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:26:09.899426 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:26:09.899399 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c12b754_7e62_456f_bb3d_f1d1940235c9.slice/crio-3f3d448ef3985c1430ae2b224b44f199653df1e25f7202da71dd192ac72a6b17 WatchSource:0}: Error finding container 3f3d448ef3985c1430ae2b224b44f199653df1e25f7202da71dd192ac72a6b17: Status 404 returned error can't find the container with id 3f3d448ef3985c1430ae2b224b44f199653df1e25f7202da71dd192ac72a6b17 Apr 22 14:26:10.462430 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.462389 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:26:10.606465 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.606422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" event={"ID":"9c12b754-7e62-456f-bb3d-f1d1940235c9","Type":"ContainerStarted","Data":"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416"} Apr 22 14:26:10.606465 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.606465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" event={"ID":"9c12b754-7e62-456f-bb3d-f1d1940235c9","Type":"ContainerStarted","Data":"3f3d448ef3985c1430ae2b224b44f199653df1e25f7202da71dd192ac72a6b17"} Apr 22 14:26:10.606656 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.606552 
2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:26:10.607729 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.607704 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:10.623762 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:10.623723 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podStartSLOduration=1.623711183 podStartE2EDuration="1.623711183s" podCreationTimestamp="2026-04-22 14:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:10.62263816 +0000 UTC m=+621.425724119" watchObservedRunningTime="2026-04-22 14:26:10.623711183 +0000 UTC m=+621.426797181" Apr 22 14:26:11.609600 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:11.609564 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:13.150306 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.150286 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:26:13.588685 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.588647 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:13.615555 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.615526 2579 generic.go:358] "Generic (PLEG): container finished" podID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerID="6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a" exitCode=0 Apr 22 14:26:13.615708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.615590 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" Apr 22 14:26:13.615708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.615593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" event={"ID":"6569b4b3-4678-49b0-86c5-21d7a839c64f","Type":"ContainerDied","Data":"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a"} Apr 22 14:26:13.615708 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.615703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh" event={"ID":"6569b4b3-4678-49b0-86c5-21d7a839c64f","Type":"ContainerDied","Data":"2843b4673d8588d57c5a54d498eed16cae9803c12a9fcfbe12b981101fca83c4"} Apr 22 14:26:13.615885 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.615726 2579 scope.go:117] "RemoveContainer" containerID="6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a" Apr 22 14:26:13.623875 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.623854 2579 scope.go:117] "RemoveContainer" containerID="6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a" Apr 22 14:26:13.624118 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:13.624100 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a\": container with ID starting with 6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a not found: ID does not exist" containerID="6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a" Apr 22 14:26:13.624164 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.624125 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a"} err="failed to get container status \"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a\": rpc error: code = NotFound desc = could not find container \"6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a\": container with ID starting with 6c92b11c3a8fa0f15a364475c664d3e6473dc81a584e12eac521919d6ae94d3a not found: ID does not exist" Apr 22 14:26:13.638130 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.638108 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:26:13.640159 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.640138 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfdf-predictor-7c4489dd67-rxsqh"] Apr 22 14:26:13.760200 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:13.760173 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" path="/var/lib/kubelet/pods/6569b4b3-4678-49b0-86c5-21d7a839c64f/volumes" Apr 22 14:26:18.588975 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:18.588922 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:20.463246 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:20.463219 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:26:21.609815 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:21.609776 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:23.589252 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:23.589214 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:23.589608 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:23.589319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:26:28.588675 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:28.588629 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:31.609667 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:31.609630 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:33.589013 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:33.588974 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:38.588903 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:38.588866 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:39.249849 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.249800 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:26:39.250322 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.250301 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" Apr 22 14:26:39.250322 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.250321 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" Apr 22 14:26:39.250490 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.250390 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6569b4b3-4678-49b0-86c5-21d7a839c64f" containerName="kserve-container" Apr 22 14:26:39.252100 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.252080 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.255648 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.255626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 22 14:26:39.256269 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.256254 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 22 14:26:39.262120 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.262096 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:26:39.372531 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.372502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.372675 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.372547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.477857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.473777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.477857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.474289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.477857 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:39.474373 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 22 14:26:39.477857 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:39.474446 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls podName:af9ba33c-6129-461b-91bd-0459125543f1 nodeName:}" failed. No retries permitted until 2026-04-22 14:26:39.974425459 +0000 UTC m=+650.777511410 (durationBeforeRetry 500ms). 
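The "No retries permitted until ... (durationBeforeRetry 500ms)" entry above is the kubelet's per-volume exponential backoff: each failed mount attempt doubles the delay before the next one, which is why a missing secret produces a short burst of errors rather than a hot loop, and why the mount succeeds quickly (see 14:26:39.979 below) once model-chainer-serving-cert appears. A stdlib sketch of that doubling schedule; the 500ms initial delay is taken from the log, while the factor and cap here are illustrative rather than copied from kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initial  = 500 * time.Millisecond // first retry delay seen in the log
            maxDelay = 2 * time.Minute        // illustrative upper bound
        )
        delay := initial
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
            delay *= 2 // double after each failure
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
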
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls") pod "model-chainer-5ddf86697d-94lkl" (UID: "af9ba33c-6129-461b-91bd-0459125543f1") : secret "model-chainer-serving-cert" not found Apr 22 14:26:39.477857 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.474891 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.595665 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.595646 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:26:39.676667 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.676638 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle\") pod \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " Apr 22 14:26:39.676851 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.676708 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls\") pod \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\" (UID: \"733ef929-3e43-46f7-80b0-1b8a0d2bcd13\") " Apr 22 14:26:39.677030 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.677005 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "733ef929-3e43-46f7-80b0-1b8a0d2bcd13" (UID: "733ef929-3e43-46f7-80b0-1b8a0d2bcd13"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:26:39.678736 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.678717 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "733ef929-3e43-46f7-80b0-1b8a0d2bcd13" (UID: "733ef929-3e43-46f7-80b0-1b8a0d2bcd13"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:26:39.687969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.687944 2579 generic.go:358] "Generic (PLEG): container finished" podID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerID="07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4" exitCode=0 Apr 22 14:26:39.688060 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.688002 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" Apr 22 14:26:39.688060 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.688018 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" event={"ID":"733ef929-3e43-46f7-80b0-1b8a0d2bcd13","Type":"ContainerDied","Data":"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4"} Apr 22 14:26:39.688060 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.688055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr" event={"ID":"733ef929-3e43-46f7-80b0-1b8a0d2bcd13","Type":"ContainerDied","Data":"7694e2d43a5b1865cd4c942bc067bd0f00f0442409ff4fc2e88b2084547a8315"} Apr 22 14:26:39.688152 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.688075 2579 scope.go:117] "RemoveContainer" containerID="07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4" Apr 22 14:26:39.695724 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.695709 2579 scope.go:117] "RemoveContainer" containerID="07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4" Apr 22 14:26:39.695987 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:39.695965 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4\": container with ID starting with 07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4 not found: ID does not exist" containerID="07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4" Apr 22 14:26:39.696051 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.695997 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4"} err="failed to get container status \"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4\": rpc error: code = NotFound desc = could not find container \"07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4\": container with ID starting with 07404c710c26d4268073fff96dabc3f8a63a3f996afb93bb6efa7518d0f2e9d4 not found: ID does not exist" Apr 22 14:26:39.709518 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.709498 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:26:39.716344 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.716324 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1cfdf-5887c7bb65-mq9nr"] Apr 22 14:26:39.760133 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.760113 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" path="/var/lib/kubelet/pods/733ef929-3e43-46f7-80b0-1b8a0d2bcd13/volumes" Apr 22 14:26:39.777335 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.777290 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:26:39.777335 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.777309 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/733ef929-3e43-46f7-80b0-1b8a0d2bcd13-proxy-tls\") on node 
\"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:26:39.979534 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.979506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:39.981727 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:39.981699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") pod \"model-chainer-5ddf86697d-94lkl\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:40.162089 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:40.162012 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:40.276225 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:40.276202 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:26:40.278486 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:26:40.278464 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-48948447dbb29386237d23635096cbb46ca296688a2ece6830dd7afeb3c4d3ce WatchSource:0}: Error finding container 48948447dbb29386237d23635096cbb46ca296688a2ece6830dd7afeb3c4d3ce: Status 404 returned error can't find the container with id 48948447dbb29386237d23635096cbb46ca296688a2ece6830dd7afeb3c4d3ce Apr 22 14:26:40.692773 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:40.692735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" event={"ID":"af9ba33c-6129-461b-91bd-0459125543f1","Type":"ContainerStarted","Data":"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798"} Apr 22 14:26:40.692773 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:40.692775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" event={"ID":"af9ba33c-6129-461b-91bd-0459125543f1","Type":"ContainerStarted","Data":"48948447dbb29386237d23635096cbb46ca296688a2ece6830dd7afeb3c4d3ce"} Apr 22 14:26:40.693201 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:40.692871 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:41.609778 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:41.609743 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:46.700939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:46.700912 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:26:46.722031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:46.721993 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" 
podStartSLOduration=7.721981141 podStartE2EDuration="7.721981141s" podCreationTimestamp="2026-04-22 14:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:40.722503903 +0000 UTC m=+651.525589870" watchObservedRunningTime="2026-04-22 14:26:46.721981141 +0000 UTC m=+657.525067097" Apr 22 14:26:49.387399 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.387366 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:26:49.387749 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.387557 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" containerID="cri-o://f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798" gracePeriod=30 Apr 22 14:26:49.629979 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.629942 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 22 14:26:49.630363 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.630346 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" Apr 22 14:26:49.630450 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.630365 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" Apr 22 14:26:49.630450 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.630441 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="733ef929-3e43-46f7-80b0-1b8a0d2bcd13" containerName="switch-graph-1cfdf" Apr 22 14:26:49.632313 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.632293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" Apr 22 14:26:49.643481 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.643122 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 14:26:49.645389 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.645361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 22 14:26:49.672154 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.672125 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:26:49.672464 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.672438 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" containerID="cri-o://d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68" gracePeriod=30 Apr 22 14:26:49.767913 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:49.767889 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 22 14:26:50.462683 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.462646 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 14:26:50.720375 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.720295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" event={"ID":"a1340150-0c1a-4d0f-ae54-df2d114c41fa","Type":"ContainerStarted","Data":"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f"} Apr 22 14:26:50.720375 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.720330 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" event={"ID":"a1340150-0c1a-4d0f-ae54-df2d114c41fa","Type":"ContainerStarted","Data":"00d5eaee34a8e7e84fe2fa134fffd3b8a191fee1934130ba165e5dfc684fe778"} Apr 22 14:26:50.720550 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.720538 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" Apr 22 14:26:50.721679 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.721656 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:26:50.740577 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:50.740520 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podStartSLOduration=1.740502632 podStartE2EDuration="1.740502632s" podCreationTimestamp="2026-04-22 14:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:50.738118109 +0000 UTC m=+661.541204070" watchObservedRunningTime="2026-04-22 14:26:50.740502632 +0000 UTC m=+661.543588589" Apr 22 14:26:51.610068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:51.610024 2579
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 14:26:51.698854 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:51.698797 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:51.724206 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:51.724172 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:26:53.199101 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.199074 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:26:53.302861 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.302765 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location\") pod \"458e541b-5615-4eef-813b-df55c095f0cf\" (UID: \"458e541b-5615-4eef-813b-df55c095f0cf\") " Apr 22 14:26:53.303092 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.303068 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "458e541b-5615-4eef-813b-df55c095f0cf" (UID: "458e541b-5615-4eef-813b-df55c095f0cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:26:53.403717 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.403689 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/458e541b-5615-4eef-813b-df55c095f0cf-kserve-provision-location\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:26:53.731362 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.731277 2579 generic.go:358] "Generic (PLEG): container finished" podID="458e541b-5615-4eef-813b-df55c095f0cf" containerID="d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68" exitCode=0 Apr 22 14:26:53.731362 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.731351 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" Apr 22 14:26:53.731551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.731356 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerDied","Data":"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68"} Apr 22 14:26:53.731551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.731391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8" event={"ID":"458e541b-5615-4eef-813b-df55c095f0cf","Type":"ContainerDied","Data":"6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e"} Apr 22 14:26:53.731551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.731406 2579 scope.go:117] "RemoveContainer" containerID="d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68" Apr 22 14:26:53.739607 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.739589 2579 scope.go:117] "RemoveContainer" containerID="2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09" Apr 22 14:26:53.746277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.746259 2579 scope.go:117] "RemoveContainer" containerID="d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68" Apr 22 14:26:53.746507 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:53.746491 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68\": container with ID starting with d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68 not found: ID does not exist" containerID="d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68" Apr 22 14:26:53.746575 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.746520 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68"} err="failed to get container status \"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68\": rpc error: code = NotFound desc = could not find container \"d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68\": container with ID starting with d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68 not found: ID does not exist" Apr 22 14:26:53.746575 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.746544 2579 scope.go:117] "RemoveContainer" containerID="2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09" Apr 22 14:26:53.746780 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:26:53.746762 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09\": container with ID starting with 2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09 not found: ID does not exist" containerID="2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09" Apr 22 14:26:53.746819 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.746786 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09"} err="failed to get container status \"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09\": rpc error: code 
= NotFound desc = could not find container \"2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09\": container with ID starting with 2b1dacfa4b357fb8fe60b47556c23b72f3ad6f61bcf19b30551a3c1c0b975a09 not found: ID does not exist" Apr 22 14:26:53.754243 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.754223 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:26:53.760893 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:53.760877 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-s8pc8"] Apr 22 14:26:55.760725 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:55.760689 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458e541b-5615-4eef-813b-df55c095f0cf" path="/var/lib/kubelet/pods/458e541b-5615-4eef-813b-df55c095f0cf/volumes" Apr 22 14:26:56.699072 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:26:56.699034 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:01.611009 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:01.610976 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:27:01.699208 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:01.699176 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:01.699379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:01.699274 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:27:01.724638 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:01.724613 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:27:06.699140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:06.699093 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:11.699384 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:11.699340 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:11.725151 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:11.725124 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:27:16.698871 ip-10-0-137-106 kubenswrapper[2579]: I0422 
14:27:16.698823 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:19.003785 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.003746 2579 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/b7081daca173b870aa540c025f56c30fa7f10d9fd0d4e1a57644a0923bb09dda/diff" to get inode usage: stat /var/lib/containers/storage/overlay/b7081daca173b870aa540c025f56c30fa7f10d9fd0d4e1a57644a0923bb09dda/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_isvc-xgboost-graph-predictor-784cb989b8-s8pc8_458e541b-5615-4eef-813b-df55c095f0cf/kserve-container/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_isvc-xgboost-graph-predictor-784cb989b8-s8pc8_458e541b-5615-4eef-813b-df55c095f0cf/kserve-container/0.log: no such file or directory Apr 22 14:27:19.444562 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.444512 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-conmon-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-conmon-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:27:19.444802 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.444544 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-conmon-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-conmon-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache]" Apr 
22 14:27:19.444897 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.444557 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-conmon-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:27:19.444961 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.444608 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-conmon-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-conmon-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-6a2891ab326a9629141ba1e041d8c9f78977a6115bacccb79ce5b9703537cb9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:27:19.446051 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.444819 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-conmon-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e541b_5615_4eef_813b_df55c095f0cf.slice/crio-d4ec0f32619e1e4eb866d875289ed12838ccde6febb7fba151da8510fb37ca68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9ba33c_6129_461b_91bd_0459125543f1.slice/crio-conmon-f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:27:19.562140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.562116 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:27:19.589029 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.589007 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle\") pod \"af9ba33c-6129-461b-91bd-0459125543f1\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " Apr 22 14:27:19.589134 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.589050 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") pod \"af9ba33c-6129-461b-91bd-0459125543f1\" (UID: \"af9ba33c-6129-461b-91bd-0459125543f1\") " Apr 22 14:27:19.589316 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.589296 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "af9ba33c-6129-461b-91bd-0459125543f1" (UID: "af9ba33c-6129-461b-91bd-0459125543f1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:27:19.590841 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.590812 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "af9ba33c-6129-461b-91bd-0459125543f1" (UID: "af9ba33c-6129-461b-91bd-0459125543f1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:27:19.690230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.690169 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9ba33c-6129-461b-91bd-0459125543f1-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:27:19.690230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.690191 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af9ba33c-6129-461b-91bd-0459125543f1-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:27:19.752803 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.752769 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:27:19.753136 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753123 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" Apr 22 14:27:19.753180 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753138 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" Apr 22 14:27:19.753180 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753147 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="storage-initializer" Apr 22 14:27:19.753180 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753152 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="storage-initializer" Apr 22 14:27:19.753180 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753161 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" Apr 22 14:27:19.753180 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753167 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" Apr 22 14:27:19.753319 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753214 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="458e541b-5615-4eef-813b-df55c095f0cf" containerName="kserve-container" Apr 22 14:27:19.753319 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.753222 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af9ba33c-6129-461b-91bd-0459125543f1" containerName="model-chainer" Apr 22 14:27:19.755134 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.755119 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.758095 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.758072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-58a83-serving-cert\"" Apr 22 14:27:19.758220 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.758101 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-58a83-kube-rbac-proxy-sar-config\"" Apr 22 14:27:19.770350 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.770333 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:27:19.791027 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.791001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.791126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.791029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.802646 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.802623 2579 generic.go:358] "Generic (PLEG): container finished" podID="af9ba33c-6129-461b-91bd-0459125543f1" containerID="f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798" exitCode=0 Apr 22 14:27:19.802718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.802670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" event={"ID":"af9ba33c-6129-461b-91bd-0459125543f1","Type":"ContainerDied","Data":"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798"} Apr 22 14:27:19.802718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.802700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" event={"ID":"af9ba33c-6129-461b-91bd-0459125543f1","Type":"ContainerDied","Data":"48948447dbb29386237d23635096cbb46ca296688a2ece6830dd7afeb3c4d3ce"} Apr 22 14:27:19.802718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.802716 2579 scope.go:117] "RemoveContainer" containerID="f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798" Apr 22 14:27:19.802816 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.802677 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl" Apr 22 14:27:19.810670 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.810657 2579 scope.go:117] "RemoveContainer" containerID="f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798" Apr 22 14:27:19.810899 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:27:19.810881 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798\": container with ID starting with f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798 not found: ID does not exist" containerID="f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798" Apr 22 14:27:19.810960 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.810909 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798"} err="failed to get container status \"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798\": rpc error: code = NotFound desc = could not find container \"f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798\": container with ID starting with f037620efb3e3ed12aabda8ed533b70865ed6c4ec270f50fdda4a0d9d655d798 not found: ID does not exist" Apr 22 14:27:19.823830 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.823806 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:27:19.832608 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.832584 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-94lkl"] Apr 22 14:27:19.891572 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.891543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.891676 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.891577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.892270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.892249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:19.893961 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:19.893939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls\") pod \"switch-graph-58a83-5fcb457996-rbdg5\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:20.065226 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:27:20.065185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:20.190895 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:20.190863 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:27:20.194184 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:27:20.194156 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e233ceb_a8a0_41b3_b26a_781b386f0396.slice/crio-1da8518475518513d51262d3ec686048294a7a1208b2ee3646e6c2ce4bc142c4 WatchSource:0}: Error finding container 1da8518475518513d51262d3ec686048294a7a1208b2ee3646e6c2ce4bc142c4: Status 404 returned error can't find the container with id 1da8518475518513d51262d3ec686048294a7a1208b2ee3646e6c2ce4bc142c4 Apr 22 14:27:20.807898 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:20.807865 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" event={"ID":"2e233ceb-a8a0-41b3-b26a-781b386f0396","Type":"ContainerStarted","Data":"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5"} Apr 22 14:27:20.807898 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:20.807895 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" event={"ID":"2e233ceb-a8a0-41b3-b26a-781b386f0396","Type":"ContainerStarted","Data":"1da8518475518513d51262d3ec686048294a7a1208b2ee3646e6c2ce4bc142c4"} Apr 22 14:27:20.808146 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:20.807980 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:20.826736 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:20.826686 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podStartSLOduration=1.826674245 podStartE2EDuration="1.826674245s" podCreationTimestamp="2026-04-22 14:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:27:20.825352074 +0000 UTC m=+691.628438042" watchObservedRunningTime="2026-04-22 14:27:20.826674245 +0000 UTC m=+691.629760243" Apr 22 14:27:21.724321 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:21.724284 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:27:21.761504 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:21.761476 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9ba33c-6129-461b-91bd-0459125543f1" path="/var/lib/kubelet/pods/af9ba33c-6129-461b-91bd-0459125543f1/volumes" Apr 22 14:27:26.815970 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:26.815932 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:27:31.724630 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:31.724581 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" 
podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 14:27:41.726085 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:41.726004 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" Apr 22 14:27:59.574868 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.574816 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:27:59.577900 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.577885 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.581140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.581118 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1091a-kube-rbac-proxy-sar-config\"" Apr 22 14:27:59.581140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.581131 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1091a-serving-cert\"" Apr 22 14:27:59.586559 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.586540 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:27:59.684658 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.684631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.684798 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.684728 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.785432 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.785407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.785569 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.785444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.786082 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.786065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.787700 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.787682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls\") pod \"sequence-graph-1091a-65cf6f8f76-c76kt\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:27:59.888126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:27:59.888047 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:28:00.005622 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.005594 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:28:00.008823 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:28:00.008796 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a61ddc_8d9b_4919_8f5c_6cdfa73cb8d5.slice/crio-79b6b44d50ed6de3ba99090d26cbda3a0bd51c9c8242e2162244f1c21e9c2ec7 WatchSource:0}: Error finding container 79b6b44d50ed6de3ba99090d26cbda3a0bd51c9c8242e2162244f1c21e9c2ec7: Status 404 returned error can't find the container with id 79b6b44d50ed6de3ba99090d26cbda3a0bd51c9c8242e2162244f1c21e9c2ec7 Apr 22 14:28:00.010437 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.010422 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:28:00.914209 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.914179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" event={"ID":"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5","Type":"ContainerStarted","Data":"41b26214eb6d405defe316e23f9267f117db4870c8a2a86ef446e9ea4c7b18a9"} Apr 22 14:28:00.914209 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.914213 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" event={"ID":"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5","Type":"ContainerStarted","Data":"79b6b44d50ed6de3ba99090d26cbda3a0bd51c9c8242e2162244f1c21e9c2ec7"} Apr 22 14:28:00.914604 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.914239 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:28:00.934738 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:00.934694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podStartSLOduration=1.934681571 podStartE2EDuration="1.934681571s" podCreationTimestamp="2026-04-22 14:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:28:00.933092772 +0000 UTC m=+731.736178740" watchObservedRunningTime="2026-04-22 14:28:00.934681571 +0000 UTC m=+731.737767528" Apr 22 14:28:06.922498 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:28:06.922464 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:35:34.388585 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.388553 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:35:34.391098 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.388784 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" containerID="cri-o://c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5" gracePeriod=30 Apr 22 14:35:34.604126 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.604094 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:35:34.604334 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.604314 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" containerID="cri-o://ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416" gracePeriod=30 Apr 22 14:35:34.725072 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.724987 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:35:34.728282 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.728259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" Apr 22 14:35:34.736414 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.736392 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:35:34.737895 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.737878 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 14:35:34.857179 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.857155 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:35:34.859119 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:35:34.859094 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158c988e_59d6_4cda_8621_d2656f8e5850.slice/crio-f2d931f676797c81a68f15cd2ed66edbda914f26d4b15174d10e4f0f497ac964 WatchSource:0}: Error finding container f2d931f676797c81a68f15cd2ed66edbda914f26d4b15174d10e4f0f497ac964: Status 404 returned error can't find the container with id f2d931f676797c81a68f15cd2ed66edbda914f26d4b15174d10e4f0f497ac964 Apr 22 14:35:34.860693 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:34.860678 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:35:35.198939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:35.198898 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" event={"ID":"158c988e-59d6-4cda-8621-d2656f8e5850","Type":"ContainerStarted","Data":"0e590ac4dbaf0691293d4cbba775f264668bf1906a3057f377ebd2b3da7e4e30"} Apr 22 14:35:35.198939 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:35.198943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" event={"ID":"158c988e-59d6-4cda-8621-d2656f8e5850","Type":"ContainerStarted","Data":"f2d931f676797c81a68f15cd2ed66edbda914f26d4b15174d10e4f0f497ac964"} Apr 22 14:35:35.199174 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:35.199154 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" Apr 22 14:35:35.200377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:35.200357 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:35:35.215090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:35.215051 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podStartSLOduration=1.215039048 podStartE2EDuration="1.215039048s" podCreationTimestamp="2026-04-22 14:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:35:35.21357326 +0000 UTC m=+1186.016659216" watchObservedRunningTime="2026-04-22 14:35:35.215039048 +0000 UTC m=+1186.018124983" Apr 22 14:35:36.202356 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:36.202315 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:35:36.814169 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:36.814133 2579 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:35:37.533337 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:37.533314 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:35:38.208632 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.208544 2579 generic.go:358] "Generic (PLEG): container finished" podID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerID="ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416" exitCode=0 Apr 22 14:35:38.208632 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.208612 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" event={"ID":"9c12b754-7e62-456f-bb3d-f1d1940235c9","Type":"ContainerDied","Data":"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416"} Apr 22 14:35:38.208787 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.208637 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" Apr 22 14:35:38.208787 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.208657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t" event={"ID":"9c12b754-7e62-456f-bb3d-f1d1940235c9","Type":"ContainerDied","Data":"3f3d448ef3985c1430ae2b224b44f199653df1e25f7202da71dd192ac72a6b17"} Apr 22 14:35:38.208787 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.208679 2579 scope.go:117] "RemoveContainer" containerID="ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416" Apr 22 14:35:38.216335 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.216315 2579 scope.go:117] "RemoveContainer" containerID="ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416" Apr 22 14:35:38.216567 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:35:38.216547 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416\": container with ID starting with ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416 not found: ID does not exist" containerID="ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416" Apr 22 14:35:38.216611 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.216577 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416"} err="failed to get container status \"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416\": rpc error: code = NotFound desc = could not find container \"ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416\": container with ID starting with ad8a2fb1f5e9614f8ad83eb915b8c6dc138f6ee606d63f92ddc5ff7008f53416 not found: ID does not exist" Apr 22 14:35:38.225250 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.225228 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:35:38.229095 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:38.229074 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-58a83-predictor-6b4556c5b5-x4q2t"] Apr 22 14:35:39.761308 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:39.761273 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" path="/var/lib/kubelet/pods/9c12b754-7e62-456f-bb3d-f1d1940235c9/volumes" Apr 22 14:35:41.814567 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:41.814535 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:35:46.202437 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:46.202395 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:35:46.814290 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:46.814251 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:35:46.814455 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:46.814346 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:35:51.814668 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:51.814634 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:35:56.203352 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:56.203309 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:35:56.813763 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:35:56.813725 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:01.814306 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:01.814271 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:04.524513 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.524493 2579 util.go:48] "No ready sandbox for pod can be found. 
Apr 22 14:36:04.617947 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.617925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls\") pod \"2e233ceb-a8a0-41b3-b26a-781b386f0396\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " Apr 22 14:36:04.618045 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.617966 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle\") pod \"2e233ceb-a8a0-41b3-b26a-781b386f0396\" (UID: \"2e233ceb-a8a0-41b3-b26a-781b386f0396\") " Apr 22 14:36:04.618307 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.618278 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2e233ceb-a8a0-41b3-b26a-781b386f0396" (UID: "2e233ceb-a8a0-41b3-b26a-781b386f0396"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:36:04.619786 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.619768 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e233ceb-a8a0-41b3-b26a-781b386f0396" (UID: "2e233ceb-a8a0-41b3-b26a-781b386f0396"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:36:04.718706 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.718658 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e233ceb-a8a0-41b3-b26a-781b386f0396-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:36:04.718706 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:04.718680 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e233ceb-a8a0-41b3-b26a-781b386f0396-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:36:05.290441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.290410 2579 generic.go:358] "Generic (PLEG): container finished" podID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerID="c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5" exitCode=0 Apr 22 14:36:05.290580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.290473 2579 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" Apr 22 14:36:05.290580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.290497 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" event={"ID":"2e233ceb-a8a0-41b3-b26a-781b386f0396","Type":"ContainerDied","Data":"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5"} Apr 22 14:36:05.290580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.290535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5" event={"ID":"2e233ceb-a8a0-41b3-b26a-781b386f0396","Type":"ContainerDied","Data":"1da8518475518513d51262d3ec686048294a7a1208b2ee3646e6c2ce4bc142c4"} Apr 22 14:36:05.290580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.290552 2579 scope.go:117] "RemoveContainer" containerID="c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5" Apr 22 14:36:05.298676 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.298653 2579 scope.go:117] "RemoveContainer" containerID="c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5" Apr 22 14:36:05.298945 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:36:05.298922 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5\": container with ID starting with c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5 not found: ID does not exist" containerID="c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5" Apr 22 14:36:05.299025 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.298956 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5"} err="failed to get container status \"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5\": rpc error: code = NotFound desc = could not find container \"c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5\": container with ID starting with c88c842fcf8393c62b0df81d06cc5212119931af1788adde0f5cab460a1d43f5 not found: ID does not exist" Apr 22 14:36:05.312114 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.312094 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:36:05.317873 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.317851 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-58a83-5fcb457996-rbdg5"] Apr 22 14:36:05.760072 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:05.760037 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" path="/var/lib/kubelet/pods/2e233ceb-a8a0-41b3-b26a-781b386f0396/volumes" Apr 22 14:36:06.203444 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:06.203364 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:36:14.284881 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.284851 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:36:14.285309 ip-10-0-137-106 
kubenswrapper[2579]: I0422 14:36:14.285068 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" containerID="cri-o://41b26214eb6d405defe316e23f9267f117db4870c8a2a86ef446e9ea4c7b18a9" gracePeriod=30 Apr 22 14:36:14.347169 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.347142 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 22 14:36:14.347385 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.347365 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" containerID="cri-o://57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f" gracePeriod=30 Apr 22 14:36:14.532436 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532401 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"] Apr 22 14:36:14.532768 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532752 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" Apr 22 14:36:14.532872 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532771 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" Apr 22 14:36:14.532872 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532802 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" Apr 22 14:36:14.532872 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532811 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" Apr 22 14:36:14.533025 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532916 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e233ceb-a8a0-41b3-b26a-781b386f0396" containerName="switch-graph-58a83" Apr 22 14:36:14.533025 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.532929 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c12b754-7e62-456f-bb3d-f1d1940235c9" containerName="kserve-container" Apr 22 14:36:14.535714 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.535665 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" Apr 22 14:36:14.544630 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.544608 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"] Apr 22 14:36:14.546989 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.546973 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" Apr 22 14:36:14.665237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:14.665213 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"] Apr 22 14:36:14.667842 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:36:14.667805 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0594a2e1_b605_4fc8_888a_9f53129071bf.slice/crio-3b854f0a673ee6f04128eff9d1ac46db1f7afa39e69d0d8f7b55902b4a02dc5d WatchSource:0}: Error finding container 3b854f0a673ee6f04128eff9d1ac46db1f7afa39e69d0d8f7b55902b4a02dc5d: Status 404 returned error can't find the container with id 3b854f0a673ee6f04128eff9d1ac46db1f7afa39e69d0d8f7b55902b4a02dc5d Apr 22 14:36:15.321219 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:15.321189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" event={"ID":"0594a2e1-b605-4fc8-888a-9f53129071bf","Type":"ContainerStarted","Data":"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"} Apr 22 14:36:15.321219 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:15.321228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" event={"ID":"0594a2e1-b605-4fc8-888a-9f53129071bf","Type":"ContainerStarted","Data":"3b854f0a673ee6f04128eff9d1ac46db1f7afa39e69d0d8f7b55902b4a02dc5d"} Apr 22 14:36:15.321693 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:15.321348 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" Apr 22 14:36:15.322557 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:15.322530 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 14:36:15.338810 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:15.338767 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podStartSLOduration=1.3387560459999999 podStartE2EDuration="1.338756046s" podCreationTimestamp="2026-04-22 14:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:15.337401094 +0000 UTC m=+1226.140487052" watchObservedRunningTime="2026-04-22 14:36:15.338756046 +0000 UTC m=+1226.141842002" Apr 22 14:36:16.202609 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:16.202568 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:36:16.325070 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:16.325034 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: 
connect: connection refused" Apr 22 14:36:16.921514 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:16.921474 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:17.679175 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:17.679155 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" Apr 22 14:36:18.332400 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.332371 2579 generic.go:358] "Generic (PLEG): container finished" podID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerID="57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f" exitCode=0 Apr 22 14:36:18.332570 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.332427 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" Apr 22 14:36:18.332570 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.332440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" event={"ID":"a1340150-0c1a-4d0f-ae54-df2d114c41fa","Type":"ContainerDied","Data":"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f"} Apr 22 14:36:18.332570 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.332466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc" event={"ID":"a1340150-0c1a-4d0f-ae54-df2d114c41fa","Type":"ContainerDied","Data":"00d5eaee34a8e7e84fe2fa134fffd3b8a191fee1934130ba165e5dfc684fe778"} Apr 22 14:36:18.332570 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.332482 2579 scope.go:117] "RemoveContainer" containerID="57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f" Apr 22 14:36:18.340066 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.340034 2579 scope.go:117] "RemoveContainer" containerID="57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f" Apr 22 14:36:18.340356 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:36:18.340334 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f\": container with ID starting with 57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f not found: ID does not exist" containerID="57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f" Apr 22 14:36:18.340488 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.340367 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f"} err="failed to get container status \"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f\": rpc error: code = NotFound desc = could not find container \"57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f\": container with ID starting with 57b3ade004e27928eac6850cb32ca9a49c4baea48a905f80946a91da149cce3f not found: ID does not exist" Apr 22 14:36:18.356568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.356545 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 
22 14:36:18.360100 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:18.360079 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1091a-predictor-7f48745f5-pmhpc"] Apr 22 14:36:19.760290 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:19.760257 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" path="/var/lib/kubelet/pods/a1340150-0c1a-4d0f-ae54-df2d114c41fa/volumes" Apr 22 14:36:21.920961 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:21.920922 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:26.203503 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:26.203469 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" Apr 22 14:36:26.326079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:26.326044 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 14:36:26.920960 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:26.920923 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:26.921150 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:26.921028 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:36:31.921579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:31.921538 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:36.325374 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:36.325289 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 14:36:36.920981 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:36.920939 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:41.921150 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:41.921114 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:44.408269 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.408239 2579 
generic.go:358] "Generic (PLEG): container finished" podID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerID="41b26214eb6d405defe316e23f9267f117db4870c8a2a86ef446e9ea4c7b18a9" exitCode=0 Apr 22 14:36:44.408550 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.408313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" event={"ID":"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5","Type":"ContainerDied","Data":"41b26214eb6d405defe316e23f9267f117db4870c8a2a86ef446e9ea4c7b18a9"} Apr 22 14:36:44.450101 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.450082 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:36:44.592944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.592883 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle\") pod \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " Apr 22 14:36:44.593058 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.592969 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls\") pod \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\" (UID: \"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5\") " Apr 22 14:36:44.593204 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.593180 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" (UID: "38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:36:44.594936 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.594920 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" (UID: "38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:36:44.652779 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.652757 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"] Apr 22 14:36:44.653110 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653097 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" Apr 22 14:36:44.653110 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653111 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" Apr 22 14:36:44.653189 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653119 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" Apr 22 14:36:44.653189 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653125 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" Apr 22 14:36:44.653189 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653185 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1340150-0c1a-4d0f-ae54-df2d114c41fa" containerName="kserve-container" Apr 22 14:36:44.653277 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.653193 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" containerName="sequence-graph-1091a" Apr 22 14:36:44.656204 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.656191 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:44.658953 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.658935 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-dee6f-kube-rbac-proxy-sar-config\"" Apr 22 14:36:44.659052 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.658936 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-dee6f-serving-cert\"" Apr 22 14:36:44.666697 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.666680 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"] Apr 22 14:36:44.693585 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.693567 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:36:44.693671 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.693588 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:36:44.794052 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.794026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " 
pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:44.794131 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.794075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:44.894783 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.894728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:44.894905 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.894797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:44.894967 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:36:44.894909 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-dee6f-serving-cert: secret "ensemble-graph-dee6f-serving-cert" not found Apr 22 14:36:44.895017 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:36:44.894996 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls podName:8685743d-a69d-4c41-88b8-2459f83740a4 nodeName:}" failed. No retries permitted until 2026-04-22 14:36:45.394973237 +0000 UTC m=+1256.198059192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls") pod "ensemble-graph-dee6f-547898ffd6-nngnb" (UID: "8685743d-a69d-4c41-88b8-2459f83740a4") : secret "ensemble-graph-dee6f-serving-cert" not found Apr 22 14:36:44.895392 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:44.895373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:45.399520 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.399494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:45.401759 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.401733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") pod \"ensemble-graph-dee6f-547898ffd6-nngnb\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:45.412580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.412562 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" Apr 22 14:36:45.412580 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.412575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt" event={"ID":"38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5","Type":"ContainerDied","Data":"79b6b44d50ed6de3ba99090d26cbda3a0bd51c9c8242e2162244f1c21e9c2ec7"} Apr 22 14:36:45.413043 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.412609 2579 scope.go:117] "RemoveContainer" containerID="41b26214eb6d405defe316e23f9267f117db4870c8a2a86ef446e9ea4c7b18a9" Apr 22 14:36:45.436492 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.436468 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:36:45.439139 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.439120 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1091a-65cf6f8f76-c76kt"] Apr 22 14:36:45.565431 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.565414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:45.678374 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.678350 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"] Apr 22 14:36:45.680539 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:36:45.680504 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8685743d_a69d_4c41_88b8_2459f83740a4.slice/crio-44f6a69f18b6106088511cea37a3becac75330d83e034fae9a5424d8feb8ebc4 WatchSource:0}: Error finding container 44f6a69f18b6106088511cea37a3becac75330d83e034fae9a5424d8feb8ebc4: Status 404 returned error can't find the container with id 44f6a69f18b6106088511cea37a3becac75330d83e034fae9a5424d8feb8ebc4 Apr 22 14:36:45.761358 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:45.761328 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5" path="/var/lib/kubelet/pods/38a61ddc-8d9b-4919-8f5c-6cdfa73cb8d5/volumes" Apr 22 14:36:46.325649 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:46.325609 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 14:36:46.417296 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:46.417267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" event={"ID":"8685743d-a69d-4c41-88b8-2459f83740a4","Type":"ContainerStarted","Data":"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"} Apr 22 14:36:46.417618 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:46.417300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" event={"ID":"8685743d-a69d-4c41-88b8-2459f83740a4","Type":"ContainerStarted","Data":"44f6a69f18b6106088511cea37a3becac75330d83e034fae9a5424d8feb8ebc4"} Apr 22 14:36:46.417618 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:46.417438 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:46.435471 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:46.435420 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podStartSLOduration=2.435403861 podStartE2EDuration="2.435403861s" podCreationTimestamp="2026-04-22 14:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:46.433953906 +0000 UTC m=+1257.237039876" watchObservedRunningTime="2026-04-22 14:36:46.435403861 +0000 UTC m=+1257.238489819" Apr 22 14:36:52.425892 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:52.425864 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:36:54.737870 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.737820 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"] Apr 22 14:36:54.738375 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.738036 2579 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" containerID="cri-o://540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352" gracePeriod=30 Apr 22 14:36:54.903209 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.903180 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:36:54.903473 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.903448 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" containerID="cri-o://0e590ac4dbaf0691293d4cbba775f264668bf1906a3057f377ebd2b3da7e4e30" gracePeriod=30 Apr 22 14:36:54.925015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.924988 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"] Apr 22 14:36:54.929562 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.929548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" Apr 22 14:36:54.934721 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.934698 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"] Apr 22 14:36:54.939802 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:54.939786 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" Apr 22 14:36:55.055702 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.055568 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"] Apr 22 14:36:55.058539 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:36:55.058511 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29239c9_642b_4958_ae16_a2649264b49c.slice/crio-14447fcc09e05e0dc5e35b7611f88fc015ae7662dd2cef39e8582c6212ce3e65 WatchSource:0}: Error finding container 14447fcc09e05e0dc5e35b7611f88fc015ae7662dd2cef39e8582c6212ce3e65: Status 404 returned error can't find the container with id 14447fcc09e05e0dc5e35b7611f88fc015ae7662dd2cef39e8582c6212ce3e65 Apr 22 14:36:55.445560 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.445469 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" event={"ID":"b29239c9-642b-4958-ae16-a2649264b49c","Type":"ContainerStarted","Data":"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"} Apr 22 14:36:55.445560 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.445515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" event={"ID":"b29239c9-642b-4958-ae16-a2649264b49c","Type":"ContainerStarted","Data":"14447fcc09e05e0dc5e35b7611f88fc015ae7662dd2cef39e8582c6212ce3e65"} Apr 22 14:36:55.445760 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.445718 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" Apr 22 
14:36:55.446947 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.446917 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 14:36:55.461641 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:55.461601 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podStartSLOduration=1.4615881800000001 podStartE2EDuration="1.46158818s" podCreationTimestamp="2026-04-22 14:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:55.459786599 +0000 UTC m=+1266.262872557" watchObservedRunningTime="2026-04-22 14:36:55.46158818 +0000 UTC m=+1266.264674136" Apr 22 14:36:56.203028 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:56.202989 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 14:36:56.325608 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:56.325571 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 14:36:56.449887 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:56.449853 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 14:36:57.425270 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:57.425236 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:36:57.454862 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:57.454819 2579 generic.go:358] "Generic (PLEG): container finished" podID="158c988e-59d6-4cda-8621-d2656f8e5850" containerID="0e590ac4dbaf0691293d4cbba775f264668bf1906a3057f377ebd2b3da7e4e30" exitCode=0 Apr 22 14:36:57.455010 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:57.454879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" event={"ID":"158c988e-59d6-4cda-8621-d2656f8e5850","Type":"ContainerDied","Data":"0e590ac4dbaf0691293d4cbba775f264668bf1906a3057f377ebd2b3da7e4e30"} Apr 22 14:36:57.532043 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:57.532024 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" Apr 22 14:36:58.459764 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:58.459742 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" Apr 22 14:36:58.460193 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:58.459740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x" event={"ID":"158c988e-59d6-4cda-8621-d2656f8e5850","Type":"ContainerDied","Data":"f2d931f676797c81a68f15cd2ed66edbda914f26d4b15174d10e4f0f497ac964"} Apr 22 14:36:58.460193 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:58.459866 2579 scope.go:117] "RemoveContainer" containerID="0e590ac4dbaf0691293d4cbba775f264668bf1906a3057f377ebd2b3da7e4e30" Apr 22 14:36:58.476301 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:58.476275 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:36:58.481209 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:58.481190 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dee6f-predictor-59f85ff5f-4799x"] Apr 22 14:36:59.761467 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:36:59.761438 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" path="/var/lib/kubelet/pods/158c988e-59d6-4cda-8621-d2656f8e5850/volumes" Apr 22 14:37:02.424454 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:02.424407 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:37:06.326002 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:06.325973 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" Apr 22 14:37:06.450438 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:06.450393 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 14:37:07.425240 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:07.425200 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:37:07.425597 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:07.425310 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:37:12.424934 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:12.424899 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:37:16.450569 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:16.450529 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 14:37:17.424803 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:17.424765 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:37:22.425362 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:22.425317 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:37:24.466790 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.466753 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"] Apr 22 14:37:24.467153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.467083 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" Apr 22 14:37:24.467153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.467094 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" Apr 22 14:37:24.467228 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.467157 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="158c988e-59d6-4cda-8621-d2656f8e5850" containerName="kserve-container" Apr 22 14:37:24.469989 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.469973 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.472927 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.472906 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-33e8e-kube-rbac-proxy-sar-config\"" Apr 22 14:37:24.473032 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.472924 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-33e8e-serving-cert\"" Apr 22 14:37:24.483609 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.483586 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"] Apr 22 14:37:24.594117 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.594091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.594256 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.594129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.695116 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.695083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.695265 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.695127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.695805 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.695779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.697371 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.697354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls\") pod \"sequence-graph-33e8e-5c55857f76-xpths\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") " pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.779564 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.779545 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" Apr 22 14:37:24.895140 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:24.895115 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"] Apr 22 14:37:24.897408 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:37:24.897382 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-13b20258401026161302fc16b04b441694996d652990b02c76e50a95fe5107d9 WatchSource:0}: Error finding container 13b20258401026161302fc16b04b441694996d652990b02c76e50a95fe5107d9: Status 404 returned error can't find the container with id 13b20258401026161302fc16b04b441694996d652990b02c76e50a95fe5107d9 Apr 22 14:37:25.353304 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.353282 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" Apr 22 14:37:25.501283 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.501251 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") pod \"8685743d-a69d-4c41-88b8-2459f83740a4\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " Apr 22 14:37:25.501680 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.501357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle\") pod \"8685743d-a69d-4c41-88b8-2459f83740a4\" (UID: \"8685743d-a69d-4c41-88b8-2459f83740a4\") " Apr 22 14:37:25.501680 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.501660 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8685743d-a69d-4c41-88b8-2459f83740a4" (UID: "8685743d-a69d-4c41-88b8-2459f83740a4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:37:25.503204 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.503162 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8685743d-a69d-4c41-88b8-2459f83740a4" (UID: "8685743d-a69d-4c41-88b8-2459f83740a4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:37:25.536894 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.536858 2579 generic.go:358] "Generic (PLEG): container finished" podID="8685743d-a69d-4c41-88b8-2459f83740a4" containerID="540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352" exitCode=137 Apr 22 14:37:25.537012 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.536898 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"
Apr 22 14:37:25.537012 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.536930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" event={"ID":"8685743d-a69d-4c41-88b8-2459f83740a4","Type":"ContainerDied","Data":"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"}
Apr 22 14:37:25.537012 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.536969 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb" event={"ID":"8685743d-a69d-4c41-88b8-2459f83740a4","Type":"ContainerDied","Data":"44f6a69f18b6106088511cea37a3becac75330d83e034fae9a5424d8feb8ebc4"}
Apr 22 14:37:25.537012 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.536985 2579 scope.go:117] "RemoveContainer" containerID="540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"
Apr 22 14:37:25.538468 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.538446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" event={"ID":"96a41e80-8086-468e-a98d-7444e1af730a","Type":"ContainerStarted","Data":"7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0"}
Apr 22 14:37:25.538571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.538477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" event={"ID":"96a41e80-8086-468e-a98d-7444e1af730a","Type":"ContainerStarted","Data":"13b20258401026161302fc16b04b441694996d652990b02c76e50a95fe5107d9"}
Apr 22 14:37:25.538624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.538582 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"
Apr 22 14:37:25.545584 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.545570 2579 scope.go:117] "RemoveContainer" containerID="540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"
Apr 22 14:37:25.545804 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:37:25.545786 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352\": container with ID starting with 540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352 not found: ID does not exist" containerID="540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"
Apr 22 14:37:25.545870 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.545812 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352"} err="failed to get container status \"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352\": rpc error: code = NotFound desc = could not find container \"540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352\": container with ID starting with 540e90f7997c2db0dfbfdedc2a0f83703c6550b02939c9ed9a700d005e0c5352 not found: ID does not exist"
Apr 22 14:37:25.569465 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.569426 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podStartSLOduration=1.569415764 podStartE2EDuration="1.569415764s" podCreationTimestamp="2026-04-22 14:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:37:25.557247667 +0000 UTC m=+1296.360333635" watchObservedRunningTime="2026-04-22 14:37:25.569415764 +0000 UTC m=+1296.372501721"
Apr 22 14:37:25.570589 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.570574 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"]
Apr 22 14:37:25.578566 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.578546 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dee6f-547898ffd6-nngnb"]
Apr 22 14:37:25.602573 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.602553 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8685743d-a69d-4c41-88b8-2459f83740a4-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:37:25.602573 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.602571 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8685743d-a69d-4c41-88b8-2459f83740a4-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:37:25.760943 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:25.760872 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" path="/var/lib/kubelet/pods/8685743d-a69d-4c41-88b8-2459f83740a4/volumes"
Apr 22 14:37:26.450377 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:26.450336 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 14:37:31.547906 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:31.547875 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"
Apr 22 14:37:34.544407 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.544380 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"]
Apr 22 14:37:34.544767 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.544564 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" containerID="cri-o://7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0" gracePeriod=30
Apr 22 14:37:34.702134 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.702105 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"]
Apr 22 14:37:34.702362 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.702326 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" containerID="cri-o://84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c" gracePeriod=30
Apr 22 14:37:34.741721 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.741688 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"]
Apr 22 14:37:34.742037 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.742025 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f"
Apr 22 14:37:34.742086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.742039 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f"
Apr 22 14:37:34.742086 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.742079 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8685743d-a69d-4c41-88b8-2459f83740a4" containerName="ensemble-graph-dee6f"
Apr 22 14:37:34.748279 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.748258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:37:34.753632 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.753607 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"]
Apr 22 14:37:34.758195 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.758175 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:37:34.870825 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:34.870802 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"]
Apr 22 14:37:34.873181 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:37:34.873153 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode386a23c_decf_4288_90e5_d43c0090ea28.slice/crio-27a72cf32e2fd7aac216254175c0ef0e6b51294baeeed1a8243e0bf1e4fe0156 WatchSource:0}: Error finding container 27a72cf32e2fd7aac216254175c0ef0e6b51294baeeed1a8243e0bf1e4fe0156: Status 404 returned error can't find the container with id 27a72cf32e2fd7aac216254175c0ef0e6b51294baeeed1a8243e0bf1e4fe0156
Apr 22 14:37:35.570571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:35.570534 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" event={"ID":"e386a23c-decf-4288-90e5-d43c0090ea28","Type":"ContainerStarted","Data":"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"}
Apr 22 14:37:35.571007 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:35.570575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" event={"ID":"e386a23c-decf-4288-90e5-d43c0090ea28","Type":"ContainerStarted","Data":"27a72cf32e2fd7aac216254175c0ef0e6b51294baeeed1a8243e0bf1e4fe0156"}
Apr 22 14:37:35.571007 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:35.570713 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:37:35.572007 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:35.571985 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:37:35.586962 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:35.586915 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podStartSLOduration=1.5869017250000002 podStartE2EDuration="1.586901725s" podCreationTimestamp="2026-04-22 14:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:37:35.585569784 +0000 UTC m=+1306.388655741" watchObservedRunningTime="2026-04-22 14:37:35.586901725 +0000 UTC m=+1306.389987678"
Apr 22 14:37:36.325701 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:36.325658 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 14:37:36.449949 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:36.449913 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 14:37:36.546876 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:36.546825 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:37:36.574022 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:36.573987 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:37:38.246205 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.246184 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"
Apr 22 14:37:38.579591 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.579558 2579 generic.go:358] "Generic (PLEG): container finished" podID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerID="84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c" exitCode=0
Apr 22 14:37:38.579719 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.579600 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" event={"ID":"0594a2e1-b605-4fc8-888a-9f53129071bf","Type":"ContainerDied","Data":"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"}
Apr 22 14:37:38.579719 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.579622 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"
Apr 22 14:37:38.579719 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.579630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th" event={"ID":"0594a2e1-b605-4fc8-888a-9f53129071bf","Type":"ContainerDied","Data":"3b854f0a673ee6f04128eff9d1ac46db1f7afa39e69d0d8f7b55902b4a02dc5d"}
Apr 22 14:37:38.579719 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.579649 2579 scope.go:117] "RemoveContainer" containerID="84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"
Apr 22 14:37:38.588052 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.588033 2579 scope.go:117] "RemoveContainer" containerID="84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"
Apr 22 14:37:38.588317 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:37:38.588297 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c\": container with ID starting with 84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c not found: ID does not exist" containerID="84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"
Apr 22 14:37:38.588378 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.588325 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c"} err="failed to get container status \"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c\": rpc error: code = NotFound desc = could not find container \"84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c\": container with ID starting with 84350f309aa3597473aeb6e35266cefa3138b6e222acaed371a9b4b0eba49d6c not found: ID does not exist"
Apr 22 14:37:38.599826 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.599806 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"]
Apr 22 14:37:38.605217 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:38.605198 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-33e8e-predictor-6697b7fc5f-7j8th"]
Apr 22 14:37:39.760492 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:39.760461 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" path="/var/lib/kubelet/pods/0594a2e1-b605-4fc8-888a-9f53129071bf/volumes"
Apr 22 14:37:41.546423 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:41.546380 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:37:46.450984 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:46.450954 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"
Apr 22 14:37:46.547121 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:46.547086 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:37:46.547267 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:46.547176 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"
Apr 22 14:37:46.574389 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:46.574355 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:37:51.546303 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:51.546260 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:37:56.546988 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:56.546952 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:37:56.574815 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:37:56.574787 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:38:01.546342 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:01.546306 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:38:04.565100 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:04.565070 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:38:04.565356 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:04.565100 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-conmon-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:38:04.565356 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:04.565160 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-conmon-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:38:04.565356 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:04.565194 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a41e80_8086_468e_a98d_7444e1af730a.slice/crio-conmon-7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:38:04.654822 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.654792 2579 generic.go:358] "Generic (PLEG): container finished" podID="96a41e80-8086-468e-a98d-7444e1af730a" containerID="7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0" exitCode=0
Apr 22 14:38:04.654944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.654850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" event={"ID":"96a41e80-8086-468e-a98d-7444e1af730a","Type":"ContainerDied","Data":"7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0"}
Apr 22 14:38:04.950603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.950540 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:38:04.950871 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.950859 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container"
Apr 22 14:38:04.950919 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.950872 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container"
Apr 22 14:38:04.950953 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.950936 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0594a2e1-b605-4fc8-888a-9f53129071bf" containerName="kserve-container"
Apr 22 14:38:04.953676 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.953663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:04.956645 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.956622 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4547e-serving-cert\""
Apr 22 14:38:04.956739 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.956646 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4547e-kube-rbac-proxy-sar-config\""
Apr 22 14:38:04.961914 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:04.961896 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:38:05.088476 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.088440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.088692 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.088673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.180579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.180563 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"
Apr 22 14:38:05.189332 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.189312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.189437 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.189349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.189526 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:05.189514 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-4547e-serving-cert: secret "ensemble-graph-4547e-serving-cert" not found
Apr 22 14:38:05.189594 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:38:05.189585 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls podName:20949593-c6e2-4226-b094-727a339e479b nodeName:}" failed. No retries permitted until 2026-04-22 14:38:05.68956858 +0000 UTC m=+1336.492654515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls") pod "ensemble-graph-4547e-65765f9c8-v4n9c" (UID: "20949593-c6e2-4226-b094-727a339e479b") : secret "ensemble-graph-4547e-serving-cert" not found
Apr 22 14:38:05.189979 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.189956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.289600 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.289580 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls\") pod \"96a41e80-8086-468e-a98d-7444e1af730a\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") "
Apr 22 14:38:05.289690 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.289668 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle\") pod \"96a41e80-8086-468e-a98d-7444e1af730a\" (UID: \"96a41e80-8086-468e-a98d-7444e1af730a\") "
Apr 22 14:38:05.290012 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.289992 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "96a41e80-8086-468e-a98d-7444e1af730a" (UID: "96a41e80-8086-468e-a98d-7444e1af730a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:38:05.291388 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.291368 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "96a41e80-8086-468e-a98d-7444e1af730a" (UID: "96a41e80-8086-468e-a98d-7444e1af730a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:38:05.390381 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.390356 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a41e80-8086-468e-a98d-7444e1af730a-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:38:05.390381 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.390384 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96a41e80-8086-468e-a98d-7444e1af730a-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:38:05.659148 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.659080 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"
Apr 22 14:38:05.659148 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.659094 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths" event={"ID":"96a41e80-8086-468e-a98d-7444e1af730a","Type":"ContainerDied","Data":"13b20258401026161302fc16b04b441694996d652990b02c76e50a95fe5107d9"}
Apr 22 14:38:05.659148 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.659134 2579 scope.go:117] "RemoveContainer" containerID="7af838d8245120bb7bc9fc6cdee585f7e8092e33c7732884616084bc8fa90fb0"
Apr 22 14:38:05.679715 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.679691 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"]
Apr 22 14:38:05.683480 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.683461 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-33e8e-5c55857f76-xpths"]
Apr 22 14:38:05.692463 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.692439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.694500 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.694473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") pod \"ensemble-graph-4547e-65765f9c8-v4n9c\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") " pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.760255 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.760230 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a41e80-8086-468e-a98d-7444e1af730a" path="/var/lib/kubelet/pods/96a41e80-8086-468e-a98d-7444e1af730a/volumes"
Apr 22 14:38:05.863914 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.863889 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:05.976938 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:05.976897 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:38:05.981201 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:38:05.981171 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20949593_c6e2_4226_b094_727a339e479b.slice/crio-b57c7b8e539ddb54bfd7eff0c5a7383c245fca71a4a4218e6e97aa59667dbf8a WatchSource:0}: Error finding container b57c7b8e539ddb54bfd7eff0c5a7383c245fca71a4a4218e6e97aa59667dbf8a: Status 404 returned error can't find the container with id b57c7b8e539ddb54bfd7eff0c5a7383c245fca71a4a4218e6e97aa59667dbf8a
Apr 22 14:38:06.574484 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:06.574449 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:38:06.662983 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:06.662955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" event={"ID":"20949593-c6e2-4226-b094-727a339e479b","Type":"ContainerStarted","Data":"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"}
Apr 22 14:38:06.663368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:06.662989 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" event={"ID":"20949593-c6e2-4226-b094-727a339e479b","Type":"ContainerStarted","Data":"b57c7b8e539ddb54bfd7eff0c5a7383c245fca71a4a4218e6e97aa59667dbf8a"}
Apr 22 14:38:06.663368 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:06.663019 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:06.680357 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:06.680222 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podStartSLOduration=2.680207937 podStartE2EDuration="2.680207937s" podCreationTimestamp="2026-04-22 14:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:38:06.679531929 +0000 UTC m=+1337.482617886" watchObservedRunningTime="2026-04-22 14:38:06.680207937 +0000 UTC m=+1337.483293895"
Apr 22 14:38:12.671849 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:12.671813 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:38:16.574694 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:16.574652 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 14:38:26.574998 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:26.574972 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:38:44.765188 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.765158 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:38:44.765615 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.765520 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e"
Apr 22 14:38:44.765615 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.765536 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e"
Apr 22 14:38:44.765615 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.765614 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="96a41e80-8086-468e-a98d-7444e1af730a" containerName="sequence-graph-33e8e"
Apr 22 14:38:44.768391 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.768375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.771044 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.771023 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0eb25-kube-rbac-proxy-sar-config\""
Apr 22 14:38:44.771135 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.771096 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0eb25-serving-cert\""
Apr 22 14:38:44.778663 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.778643 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:38:44.873940 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.873907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.874090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.873956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.974558 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.974532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.974706 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.974579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.975244 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.975221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:44.977088 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:44.977064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls\") pod \"sequence-graph-0eb25-cb766f4b9-gvmml\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") " pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:45.079954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.079874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:45.194655 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.194591 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:38:45.197182 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:38:45.197160 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90cc6707_c4c6_42fd_8bf5_f5fbdfdead13.slice/crio-f4af91fbd22efe53eb4736ed8d026ecffaf38cb92303999bf88b81219fdfac22 WatchSource:0}: Error finding container f4af91fbd22efe53eb4736ed8d026ecffaf38cb92303999bf88b81219fdfac22: Status 404 returned error can't find the container with id f4af91fbd22efe53eb4736ed8d026ecffaf38cb92303999bf88b81219fdfac22
Apr 22 14:38:45.774954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.774915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" event={"ID":"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13","Type":"ContainerStarted","Data":"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"}
Apr 22 14:38:45.774954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.774957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" event={"ID":"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13","Type":"ContainerStarted","Data":"f4af91fbd22efe53eb4736ed8d026ecffaf38cb92303999bf88b81219fdfac22"}
Apr 22 14:38:45.775393 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.775075 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:38:45.793412 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:45.793368 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podStartSLOduration=1.793354742 podStartE2EDuration="1.793354742s" podCreationTimestamp="2026-04-22 14:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:38:45.7918872 +0000 UTC m=+1376.594973158" watchObservedRunningTime="2026-04-22 14:38:45.793354742 +0000 UTC m=+1376.596440742"
Apr 22 14:38:51.782359 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:38:51.782333 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:46:19.789084 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:19.789051 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:46:19.789527 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:19.789239 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" containerID="cri-o://95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022" gracePeriod=30
Apr 22 14:46:20.070022 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.069952 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"]
Apr 22 14:46:20.073027 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.073011 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"
Apr 22 14:46:20.077230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.077206 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"]
Apr 22 14:46:20.077412 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.077394 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container" containerID="cri-o://41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d" gracePeriod=30
Apr 22 14:46:20.082379 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.082359 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"
Apr 22 14:46:20.085241 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.085216 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"]
Apr 22 14:46:20.198392 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.198320 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"]
Apr 22 14:46:20.201489 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:46:20.201462 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d4c1fb_542a_4c62_9d43_ba68f3c25ce8.slice/crio-95a7b352348694d19fb814abc62170011baaaa43b790d958db617808eec8013a WatchSource:0}: Error finding container 95a7b352348694d19fb814abc62170011baaaa43b790d958db617808eec8013a: Status 404 returned error can't find the container with id 95a7b352348694d19fb814abc62170011baaaa43b790d958db617808eec8013a
Apr 22 14:46:20.203228 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:20.203208 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:46:21.035903 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:21.035867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" event={"ID":"76d4c1fb-542a-4c62-9d43-ba68f3c25ce8","Type":"ContainerStarted","Data":"8642ea1e3740733eef056927c132e144a45a1a07a5619d43cc712ff96f45ef6b"}
Apr 22 14:46:21.036294 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:21.035910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" event={"ID":"76d4c1fb-542a-4c62-9d43-ba68f3c25ce8","Type":"ContainerStarted","Data":"95a7b352348694d19fb814abc62170011baaaa43b790d958db617808eec8013a"}
Apr 22 14:46:21.036294 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:21.036076 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"
Apr 22 14:46:21.037448 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:21.037402 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 14:46:21.054200 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:21.054153 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podStartSLOduration=1.054139498 podStartE2EDuration="1.054139498s" podCreationTimestamp="2026-04-22 14:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:46:21.052896699 +0000 UTC m=+1831.855982654" watchObservedRunningTime="2026-04-22 14:46:21.054139498 +0000 UTC m=+1831.857225456"
Apr 22 14:46:22.039582 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:22.039544 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 14:46:22.606518 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:22.606499 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"
Apr 22 14:46:22.670555 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:22.670523 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:23.042759 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.042726 2579 generic.go:358] "Generic (PLEG): container finished" podID="b29239c9-642b-4958-ae16-a2649264b49c" containerID="41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d" exitCode=0
Apr 22 14:46:23.043159 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.042806 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"
Apr 22 14:46:23.043159 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.042809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" event={"ID":"b29239c9-642b-4958-ae16-a2649264b49c","Type":"ContainerDied","Data":"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"}
Apr 22 14:46:23.043159 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.042864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp" event={"ID":"b29239c9-642b-4958-ae16-a2649264b49c","Type":"ContainerDied","Data":"14447fcc09e05e0dc5e35b7611f88fc015ae7662dd2cef39e8582c6212ce3e65"}
Apr 22 14:46:23.043159 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.042883 2579 scope.go:117] "RemoveContainer" containerID="41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"
Apr 22 14:46:23.050520 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.050419 2579 scope.go:117] "RemoveContainer" containerID="41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"
Apr 22 14:46:23.050746 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:46:23.050728 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d\": container with ID starting with 41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d not found: ID does not exist" containerID="41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"
Apr 22 14:46:23.050804 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.050758 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d"} err="failed to get container status \"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d\": rpc error: code = NotFound desc = could not find container \"41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d\": container with ID starting with 41829bc7aab43061aaba5675bedd9bb6ac721724ae8b0810176075bfc33d748d not found: ID does not exist"
Apr 22 14:46:23.063982 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.063957 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"]
Apr 22 14:46:23.065718 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.065696 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4547e-predictor-5b6cb6694f-b9snp"]
Apr 22 14:46:23.760747 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:23.760712 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29239c9-642b-4958-ae16-a2649264b49c" path="/var/lib/kubelet/pods/b29239c9-642b-4958-ae16-a2649264b49c/volumes"
Apr 22 14:46:27.670041 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:27.669993 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:32.040607 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:32.040555 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 14:46:32.670444 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:32.670397 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:32.670603 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:32.670504 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:46:37.670310 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:37.670271 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:42.040566 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:42.040526 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 14:46:42.669875 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:42.669824 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:47.670682 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:47.670644 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:46:49.928670 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:49.928651 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:46:50.018624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.018600 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") pod \"20949593-c6e2-4226-b094-727a339e479b\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") "
Apr 22 14:46:50.020903 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.020876 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "20949593-c6e2-4226-b094-727a339e479b" (UID: "20949593-c6e2-4226-b094-727a339e479b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:46:50.119176 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.119119 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle\") pod \"20949593-c6e2-4226-b094-727a339e479b\" (UID: \"20949593-c6e2-4226-b094-727a339e479b\") "
Apr 22 14:46:50.119273 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.119262 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20949593-c6e2-4226-b094-727a339e479b-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:46:50.119420 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.119398 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "20949593-c6e2-4226-b094-727a339e479b" (UID: "20949593-c6e2-4226-b094-727a339e479b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:46:50.122985 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.122959 2579 generic.go:358] "Generic (PLEG): container finished" podID="20949593-c6e2-4226-b094-727a339e479b" containerID="95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022" exitCode=0
Apr 22 14:46:50.123083 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.123025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" event={"ID":"20949593-c6e2-4226-b094-727a339e479b","Type":"ContainerDied","Data":"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"}
Apr 22 14:46:50.123083 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.123052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c" event={"ID":"20949593-c6e2-4226-b094-727a339e479b","Type":"ContainerDied","Data":"b57c7b8e539ddb54bfd7eff0c5a7383c245fca71a4a4218e6e97aa59667dbf8a"}
Apr 22 14:46:50.123083 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.123068 2579 scope.go:117] "RemoveContainer" containerID="95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"
Apr 22 14:46:50.123237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.123029 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"
Apr 22 14:46:50.131168 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.131027 2579 scope.go:117] "RemoveContainer" containerID="95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"
Apr 22 14:46:50.131314 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:46:50.131290 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022\": container with ID starting with 95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022 not found: ID does not exist" containerID="95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"
Apr 22 14:46:50.131401 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.131325 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022"} err="failed to get container status \"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022\": rpc error: code = NotFound desc = could not find container \"95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022\": container with ID starting with 95971131230d31441f36237b8bc822a8e71566648cfed3f838645d359b370022 not found: ID does not exist"
Apr 22 14:46:50.145716 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.145694 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:46:50.150816 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.150797 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4547e-65765f9c8-v4n9c"]
Apr 22 14:46:50.220384 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:50.220365 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20949593-c6e2-4226-b094-727a339e479b-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:46:51.760479 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:51.760446 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20949593-c6e2-4226-b094-727a339e479b" path="/var/lib/kubelet/pods/20949593-c6e2-4226-b094-727a339e479b/volumes"
Apr 22 14:46:52.040123 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:52.040039 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 14:46:59.569036 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.569005 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:46:59.569480 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.569232 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" containerID="cri-o://44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba" gracePeriod=30
Apr 22 14:46:59.714592 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.714561 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"]
Apr 22 14:46:59.714787 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.714767 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container" containerID="cri-o://14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110" gracePeriod=30
Apr 22 14:46:59.836158 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836084 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"]
Apr 22 14:46:59.836423 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836411 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container"
Apr 22 14:46:59.836463 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836425 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container"
Apr 22 14:46:59.836463 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836449 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e"
Apr 22 14:46:59.836463 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836456 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e"
Apr 22 14:46:59.836552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836507 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b29239c9-642b-4958-ae16-a2649264b49c" containerName="kserve-container"
Apr 22 14:46:59.836552 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.836515 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="20949593-c6e2-4226-b094-727a339e479b" containerName="ensemble-graph-4547e"
Apr 22 14:46:59.841031 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.841012 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"
Apr 22 14:46:59.846438 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.846416 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"]
Apr 22 14:46:59.852038 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.852014 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"
Apr 22 14:46:59.971617 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:46:59.971593 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"]
Apr 22 14:46:59.974020 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:46:59.973988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc9f5d6_3ebf_4867_a668_9d844efd1d0f.slice/crio-ad45bdc947e66daa54a58f6cef90daad0678f433c417518021f518ccec629937 WatchSource:0}: Error finding container ad45bdc947e66daa54a58f6cef90daad0678f433c417518021f518ccec629937: Status 404 returned error can't find the container with id ad45bdc947e66daa54a58f6cef90daad0678f433c417518021f518ccec629937
Apr 22 14:47:00.154223 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:00.154157 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" event={"ID":"8fc9f5d6-3ebf-4867-a668-9d844efd1d0f","Type":"ContainerStarted","Data":"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537"}
Apr 22 14:47:00.154223 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:00.154197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" event={"ID":"8fc9f5d6-3ebf-4867-a668-9d844efd1d0f","Type":"ContainerStarted","Data":"ad45bdc947e66daa54a58f6cef90daad0678f433c417518021f518ccec629937"}
Apr 22 14:47:00.154380 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:00.154270 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"
Apr 22 14:47:00.155659 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:00.155635 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 14:47:00.171363 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:00.171323 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podStartSLOduration=1.1713114 podStartE2EDuration="1.1713114s" podCreationTimestamp="2026-04-22 14:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:00.169897636 +0000 UTC m=+1870.972983605" watchObservedRunningTime="2026-04-22 14:47:00.1713114 +0000 UTC m=+1870.974397424"
Apr 22 14:47:01.156568 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:01.156534 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 14:47:01.781792 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:01.781757 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:47:02.041096 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:02.041016 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"
Apr 22 14:47:02.639850 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:47:02.639806 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode386a23c_decf_4288_90e5_d43c0090ea28.slice/crio-14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:47:02.846613 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:02.846594 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:47:03.167005 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.166964 2579 generic.go:358] "Generic (PLEG): container finished" podID="e386a23c-decf-4288-90e5-d43c0090ea28" containerID="14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110" exitCode=0
Apr 22 14:47:03.167166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.167053 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"
Apr 22 14:47:03.167166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.167055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" event={"ID":"e386a23c-decf-4288-90e5-d43c0090ea28","Type":"ContainerDied","Data":"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"}
Apr 22 14:47:03.167166 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.167150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b" event={"ID":"e386a23c-decf-4288-90e5-d43c0090ea28","Type":"ContainerDied","Data":"27a72cf32e2fd7aac216254175c0ef0e6b51294baeeed1a8243e0bf1e4fe0156"}
Apr 22 14:47:03.167317 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.167173 2579 scope.go:117] "RemoveContainer" containerID="14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"
Apr 22 14:47:03.174897 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.174697 2579 scope.go:117] "RemoveContainer" containerID="14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"
Apr 22 14:47:03.175080 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:47:03.175053 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110\": container with ID starting with 14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110 not found: ID does not exist" containerID="14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"
Apr 22 14:47:03.175174 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.175086 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110"} err="failed to get container status \"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110\": rpc error: code = NotFound desc = could not find container \"14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110\": container with ID starting with 
14f42a4008bbd665460fead2b01457970349ac32530d668ed67e03571ece5110 not found: ID does not exist" Apr 22 14:47:03.187989 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.187967 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"] Apr 22 14:47:03.193447 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.193425 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0eb25-predictor-f6f5cc7f5-fbr4b"] Apr 22 14:47:03.759905 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:03.759875 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" path="/var/lib/kubelet/pods/e386a23c-decf-4288-90e5-d43c0090ea28/volumes" Apr 22 14:47:06.781744 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:06.781675 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:11.156721 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:11.156686 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 22 14:47:11.781609 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:11.781562 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:11.781767 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:11.781673 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" Apr 22 14:47:16.781359 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:16.781319 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:21.157127 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:21.157087 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 22 14:47:21.782027 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:21.781983 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:26.781441 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:26.781397 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:29.705281 
Apr 22 14:47:29.812697 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.812666 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle\") pod \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") "
Apr 22 14:47:29.812866 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.812735 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls\") pod \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\" (UID: \"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13\") "
Apr 22 14:47:29.813015 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.812994 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" (UID: "90cc6707-c4c6-42fd-8bf5-f5fbdfdead13"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:47:29.814624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.814604 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" (UID: "90cc6707-c4c6-42fd-8bf5-f5fbdfdead13"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:47:29.913446 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.913371 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:47:29.913446 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:29.913403 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:47:30.050283 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050256 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"]
Apr 22 14:47:30.050579 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050568 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25"
Apr 22 14:47:30.050624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050581 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25"
Apr 22 14:47:30.050624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050594 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container"
Apr 22 14:47:30.050624 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050599 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container"
Apr 22 14:47:30.050713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050642 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e386a23c-decf-4288-90e5-d43c0090ea28" containerName="kserve-container"
Apr 22 14:47:30.050713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.050650 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerName="sequence-graph-0eb25"
Apr 22 14:47:30.053602 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.053585 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.056442 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.056420 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-f4200-kube-rbac-proxy-sar-config\""
Apr 22 14:47:30.056561 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.056480 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-f4200-serving-cert\""
Apr 22 14:47:30.060721 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.060699 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"]
Apr 22 14:47:30.114106 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.114079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.114230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.114128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.215423 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.215362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.215423 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.215413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.215559 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:47:30.215505 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-f4200-serving-cert: secret "splitter-graph-f4200-serving-cert" not found
Apr 22 14:47:30.215597 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:47:30.215590 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls podName:d6dbfab9-6aed-49fc-81a5-abfd0cfa6851 nodeName:}" failed. No retries permitted until 2026-04-22 14:47:30.715572598 +0000 UTC m=+1901.518658533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls") pod "splitter-graph-f4200-749f97c4cf-6f7k6" (UID: "d6dbfab9-6aed-49fc-81a5-abfd0cfa6851") : secret "splitter-graph-f4200-serving-cert" not found
Apr 22 14:47:30.215969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.215952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.243362 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.243334 2579 generic.go:358] "Generic (PLEG): container finished" podID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" containerID="44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba" exitCode=0
Apr 22 14:47:30.243470 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.243415 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"
Apr 22 14:47:30.243519 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.243415 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" event={"ID":"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13","Type":"ContainerDied","Data":"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"}
Apr 22 14:47:30.243558 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.243523 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml" event={"ID":"90cc6707-c4c6-42fd-8bf5-f5fbdfdead13","Type":"ContainerDied","Data":"f4af91fbd22efe53eb4736ed8d026ecffaf38cb92303999bf88b81219fdfac22"}
Apr 22 14:47:30.243558 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.243543 2579 scope.go:117] "RemoveContainer" containerID="44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"
Apr 22 14:47:30.252084 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.252064 2579 scope.go:117] "RemoveContainer" containerID="44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"
Apr 22 14:47:30.252328 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:47:30.252308 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba\": container with ID starting with 44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba not found: ID does not exist" containerID="44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"
Apr 22 14:47:30.252397 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.252341 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba"} err="failed to get container status \"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba\": rpc error: code = NotFound desc = could not find container \"44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba\": container with ID starting with 44245dd1e6242524547d5b05be1376d1d3c2f0259c7e2149b4a99945e9498eba not found: ID does not exist"
Apr 22 14:47:30.265137 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.265115 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:47:30.271052 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.271031 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0eb25-cb766f4b9-gvmml"]
Apr 22 14:47:30.719335 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.719299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.721583 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.721563 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") pod \"splitter-graph-f4200-749f97c4cf-6f7k6\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") " pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:30.963567 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:30.963540 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:31.077415 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.077386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"]
Apr 22 14:47:31.080731 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:47:31.080705 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-4b8679ad2c793f4dfff5e0b8d5ea7a743d4615c79911a2cf55bf99461a61a28a WatchSource:0}: Error finding container 4b8679ad2c793f4dfff5e0b8d5ea7a743d4615c79911a2cf55bf99461a61a28a: Status 404 returned error can't find the container with id 4b8679ad2c793f4dfff5e0b8d5ea7a743d4615c79911a2cf55bf99461a61a28a
Apr 22 14:47:31.157135 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.157098 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 14:47:31.247826 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.247789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" event={"ID":"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851","Type":"ContainerStarted","Data":"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"}
Apr 22 14:47:31.248000 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.247827 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" event={"ID":"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851","Type":"ContainerStarted","Data":"4b8679ad2c793f4dfff5e0b8d5ea7a743d4615c79911a2cf55bf99461a61a28a"}
Apr 22 14:47:31.248000 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.247872 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:47:31.265068 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.265021 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podStartSLOduration=1.265005922 podStartE2EDuration="1.265005922s" podCreationTimestamp="2026-04-22 14:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:31.264447974 +0000 UTC m=+1902.067533931" watchObservedRunningTime="2026-04-22 14:47:31.265005922 +0000 UTC m=+1902.068091861"
pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podStartSLOduration=1.265005922 podStartE2EDuration="1.265005922s" podCreationTimestamp="2026-04-22 14:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:31.264447974 +0000 UTC m=+1902.067533931" watchObservedRunningTime="2026-04-22 14:47:31.265005922 +0000 UTC m=+1902.068091861" Apr 22 14:47:31.762087 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:31.762051 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cc6707-c4c6-42fd-8bf5-f5fbdfdead13" path="/var/lib/kubelet/pods/90cc6707-c4c6-42fd-8bf5-f5fbdfdead13/volumes" Apr 22 14:47:37.255630 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:37.255603 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" Apr 22 14:47:40.194793 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.194762 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"] Apr 22 14:47:40.195237 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.195074 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" containerID="cri-o://9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9" gracePeriod=30 Apr 22 14:47:40.426315 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.426280 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:47:40.430196 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.430179 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" Apr 22 14:47:40.438693 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.438670 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:47:40.440581 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.440562 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" Apr 22 14:47:40.446061 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.446009 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"] Apr 22 14:47:40.446232 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.446211 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" containerID="cri-o://8642ea1e3740733eef056927c132e144a45a1a07a5619d43cc712ff96f45ef6b" gracePeriod=30 Apr 22 14:47:40.555910 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:40.555827 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:47:40.559456 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:47:40.559427 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb43a9a_3e61_4a9e_8a98_6db4b0437e65.slice/crio-01010b525feac75a7be699d68cc026a8baa043103f2f454419c5523049ba4f04 WatchSource:0}: Error finding container 01010b525feac75a7be699d68cc026a8baa043103f2f454419c5523049ba4f04: Status 404 returned error can't find the container with id 01010b525feac75a7be699d68cc026a8baa043103f2f454419c5523049ba4f04 Apr 22 14:47:41.156970 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.156927 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 22 14:47:41.277236 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.277199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" event={"ID":"1cb43a9a-3e61-4a9e-8a98-6db4b0437e65","Type":"ContainerStarted","Data":"2944c78075c52c0330b7412e7e398174f7a4669d23d96b94a1ec133b80c6a421"} Apr 22 14:47:41.277236 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.277235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" event={"ID":"1cb43a9a-3e61-4a9e-8a98-6db4b0437e65","Type":"ContainerStarted","Data":"01010b525feac75a7be699d68cc026a8baa043103f2f454419c5523049ba4f04"} Apr 22 14:47:41.277709 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.277418 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" Apr 22 14:47:41.278685 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.278658 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 14:47:41.294860 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:41.294807 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podStartSLOduration=1.294793783 podStartE2EDuration="1.294793783s" podCreationTimestamp="2026-04-22 14:47:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:41.293703657 +0000 UTC m=+1912.096789624" watchObservedRunningTime="2026-04-22 14:47:41.294793783 +0000 UTC m=+1912.097879739" Apr 22 14:47:42.040045 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:42.040006 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 14:47:42.254991 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:42.254953 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:42.280635 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:42.280593 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 14:47:43.285081 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:43.285050 2579 generic.go:358] "Generic (PLEG): container finished" podID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerID="8642ea1e3740733eef056927c132e144a45a1a07a5619d43cc712ff96f45ef6b" exitCode=0 Apr 22 14:47:43.285381 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:43.285112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" event={"ID":"76d4c1fb-542a-4c62-9d43-ba68f3c25ce8","Type":"ContainerDied","Data":"8642ea1e3740733eef056927c132e144a45a1a07a5619d43cc712ff96f45ef6b"} Apr 22 14:47:43.382188 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:43.382167 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" Apr 22 14:47:44.288711 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:44.288677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" event={"ID":"76d4c1fb-542a-4c62-9d43-ba68f3c25ce8","Type":"ContainerDied","Data":"95a7b352348694d19fb814abc62170011baaaa43b790d958db617808eec8013a"} Apr 22 14:47:44.288711 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:44.288705 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq" Apr 22 14:47:44.289202 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:44.288721 2579 scope.go:117] "RemoveContainer" containerID="8642ea1e3740733eef056927c132e144a45a1a07a5619d43cc712ff96f45ef6b" Apr 22 14:47:44.308735 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:44.308578 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"] Apr 22 14:47:44.310193 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:44.310172 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f4200-predictor-86dfdc759f-mxddq"] Apr 22 14:47:45.760969 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:45.760936 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" path="/var/lib/kubelet/pods/76d4c1fb-542a-4c62-9d43-ba68f3c25ce8/volumes" Apr 22 14:47:47.254880 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:47.254824 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:51.157810 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:51.157782 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" Apr 22 14:47:52.254310 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:52.254266 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:47:52.254676 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:52.254375 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" Apr 22 14:47:52.281211 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:52.281183 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 14:47:57.254961 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:47:57.254919 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:48:02.254396 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:02.254356 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:48:02.281365 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:02.281335 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 14:48:07.254491 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:07.254453 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:48:09.761785 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.761747 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"] Apr 22 14:48:09.762153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.762074 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" Apr 22 14:48:09.762153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.762084 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" Apr 22 14:48:09.762153 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.762136 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d4c1fb-542a-4c62-9d43-ba68f3c25ce8" containerName="kserve-container" Apr 22 14:48:09.766413 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.766398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 14:48:09.769383 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.769361 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4e291-kube-rbac-proxy-sar-config\"" Apr 22 14:48:09.769477 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.769394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4e291-serving-cert\"" Apr 22 14:48:09.772142 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.772123 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"] Apr 22 14:48:09.812963 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.812940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 14:48:09.813089 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.812987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 14:48:09.913284 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.913254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 14:48:09.913405 
Apr 22 14:48:09.913459 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:09.913444 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-4e291-serving-cert: secret "switch-graph-4e291-serving-cert" not found
Apr 22 14:48:09.913526 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:09.913515 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls podName:d6062ebe-8887-4cb7-b11c-af631c1f9061 nodeName:}" failed. No retries permitted until 2026-04-22 14:48:10.413491953 +0000 UTC m=+1941.216577894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls") pod "switch-graph-4e291-65ccfb5fb-lzpv7" (UID: "d6062ebe-8887-4cb7-b11c-af631c1f9061") : secret "switch-graph-4e291-serving-cert" not found
Apr 22 14:48:09.913968 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:09.913953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:10.217021 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:10.216990 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-conmon-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:48:10.217021 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:10.216997 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-conmon-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:48:10.217205 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:10.217018 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-conmon-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6dbfab9_6aed_49fc_81a5_abfd0cfa6851.slice/crio-9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:48:10.338079 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.338055 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:48:10.361455 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.361423 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerID="9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9" exitCode=0
Apr 22 14:48:10.361571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.361482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" event={"ID":"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851","Type":"ContainerDied","Data":"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"}
Apr 22 14:48:10.361571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.361489 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"
Apr 22 14:48:10.361571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.361513 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6" event={"ID":"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851","Type":"ContainerDied","Data":"4b8679ad2c793f4dfff5e0b8d5ea7a743d4615c79911a2cf55bf99461a61a28a"}
Apr 22 14:48:10.361571 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.361532 2579 scope.go:117] "RemoveContainer" containerID="9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"
Apr 22 14:48:10.369714 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.369697 2579 scope.go:117] "RemoveContainer" containerID="9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"
Apr 22 14:48:10.369997 ip-10-0-137-106 kubenswrapper[2579]: E0422 14:48:10.369975 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9\": container with ID starting with 9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9 not found: ID does not exist" containerID="9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"
Apr 22 14:48:10.370098 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.370005 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9"} err="failed to get container status \"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9\": rpc error: code = NotFound desc = could not find container \"9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9\": container with ID starting with 9768e4f18d61b5578757c66298675222e34c6815f191d46359f7e3ad047f7ec9 not found: ID does not exist"
Apr 22 14:48:10.416923 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.416896 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") pod \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") "
Apr 22 14:48:10.417061 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.417048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:10.418743 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.418718 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" (UID: "d6dbfab9-6aed-49fc-81a5-abfd0cfa6851"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:48:10.419229 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.419213 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls\") pod \"switch-graph-4e291-65ccfb5fb-lzpv7\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:10.517399 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.517377 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle\") pod \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\" (UID: \"d6dbfab9-6aed-49fc-81a5-abfd0cfa6851\") "
Apr 22 14:48:10.517578 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.517563 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:48:10.517713 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.517694 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" (UID: "d6dbfab9-6aed-49fc-81a5-abfd0cfa6851"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:48:10.618487 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.618457 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\""
Apr 22 14:48:10.676421 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.676399 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:10.689059 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.689033 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"]
Apr 22 14:48:10.695904 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.695880 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-f4200-749f97c4cf-6f7k6"]
Apr 22 14:48:10.803127 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:10.803055 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"]
Apr 22 14:48:10.806078 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:48:10.806043 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6062ebe_8887_4cb7_b11c_af631c1f9061.slice/crio-c15d545c32bc8873e91421b5b08126655e388acc51036fd757dadd80bd02d67a WatchSource:0}: Error finding container c15d545c32bc8873e91421b5b08126655e388acc51036fd757dadd80bd02d67a: Status 404 returned error can't find the container with id c15d545c32bc8873e91421b5b08126655e388acc51036fd757dadd80bd02d67a
Apr 22 14:48:11.371033 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:11.370992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" event={"ID":"d6062ebe-8887-4cb7-b11c-af631c1f9061","Type":"ContainerStarted","Data":"4c2cf21919365ec1b96eb47adbfb5bd52eb19de5da0d7853a0010c880260be2c"}
Apr 22 14:48:11.371033 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:11.371036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" event={"ID":"d6062ebe-8887-4cb7-b11c-af631c1f9061","Type":"ContainerStarted","Data":"c15d545c32bc8873e91421b5b08126655e388acc51036fd757dadd80bd02d67a"}
Apr 22 14:48:11.371301 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:11.371141 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:11.390192 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:11.390140 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podStartSLOduration=2.390124484 podStartE2EDuration="2.390124484s" podCreationTimestamp="2026-04-22 14:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:48:11.388776115 +0000 UTC m=+1942.191862092" watchObservedRunningTime="2026-04-22 14:48:11.390124484 +0000 UTC m=+1942.193210440"
Apr 22 14:48:11.761090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:11.761051 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" path="/var/lib/kubelet/pods/d6dbfab9-6aed-49fc-81a5-abfd0cfa6851/volumes"
Apr 22 14:48:12.281551 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:12.281513 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 22 14:48:17.379674 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:17.379649 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"
Apr 22 14:48:22.281245 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:22.281202 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 22 14:48:32.281923 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:32.281887 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"
Apr 22 14:48:50.499750 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.499720 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"]
Apr 22 14:48:50.500136 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.500080 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200"
Apr 22 14:48:50.500136 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.500092 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200"
Apr 22 14:48:50.500204 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.500138 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6dbfab9-6aed-49fc-81a5-abfd0cfa6851" containerName="splitter-graph-f4200"
Apr 22 14:48:50.502970 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.502954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.506604 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.506582 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-055dc-kube-rbac-proxy-sar-config\""
Apr 22 14:48:50.508529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.508512 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-055dc-serving-cert\""
Apr 22 14:48:50.519466 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.519443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"]
Apr 22 14:48:50.620945 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.620914 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.620945 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.620945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.721598 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.721565 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.721598 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.721597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.722233 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.722209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.723859 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.723825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls\") pod \"splitter-graph-055dc-576fcf5468-jgbgv\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.812344 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.812265 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:50.933915 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:50.933890 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"]
Apr 22 14:48:50.936996 ip-10-0-137-106 kubenswrapper[2579]: W0422 14:48:50.936971 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod976866d0_ceea_428e_b669_8b5641eb6c17.slice/crio-07e99657d3ef898e506704ead072d46185b4e329b09dd679420caa79c548814b WatchSource:0}: Error finding container 07e99657d3ef898e506704ead072d46185b4e329b09dd679420caa79c548814b: Status 404 returned error can't find the container with id 07e99657d3ef898e506704ead072d46185b4e329b09dd679420caa79c548814b
Apr 22 14:48:51.487105 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:51.487071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" event={"ID":"976866d0-ceea-428e-b669-8b5641eb6c17","Type":"ContainerStarted","Data":"51eaff106b1ca8840487e16e9ddccbb1d1dc149e995f09c76881b0e07a7600d0"}
Apr 22 14:48:51.487105 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:51.487105 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" event={"ID":"976866d0-ceea-428e-b669-8b5641eb6c17","Type":"ContainerStarted","Data":"07e99657d3ef898e506704ead072d46185b4e329b09dd679420caa79c548814b"}
Apr 22 14:48:51.487307 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:51.487201 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"
Apr 22 14:48:51.509189 ip-10-0-137-106 kubenswrapper[2579]:
I0422 14:48:51.509146 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podStartSLOduration=1.509132774 podStartE2EDuration="1.509132774s" podCreationTimestamp="2026-04-22 14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:48:51.507694785 +0000 UTC m=+1982.310780764" watchObservedRunningTime="2026-04-22 14:48:51.509132774 +0000 UTC m=+1982.312218731" Apr 22 14:48:57.495060 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:48:57.495034 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" Apr 22 14:57:04.925047 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:04.925007 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"] Apr 22 14:57:04.927360 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:04.925296 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" containerID="cri-o://51eaff106b1ca8840487e16e9ddccbb1d1dc149e995f09c76881b0e07a7600d0" gracePeriod=30 Apr 22 14:57:05.085529 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:05.085499 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:57:05.085744 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:05.085722 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" containerID="cri-o://2944c78075c52c0330b7412e7e398174f7a4669d23d96b94a1ec133b80c6a421" gracePeriod=30 Apr 22 14:57:07.493558 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:07.493520 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:57:07.876311 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:07.876278 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerID="2944c78075c52c0330b7412e7e398174f7a4669d23d96b94a1ec133b80c6a421" exitCode=0 Apr 22 14:57:07.876439 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:07.876321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" event={"ID":"1cb43a9a-3e61-4a9e-8a98-6db4b0437e65","Type":"ContainerDied","Data":"2944c78075c52c0330b7412e7e398174f7a4669d23d96b94a1ec133b80c6a421"} Apr 22 14:57:08.421046 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.421025 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" Apr 22 14:57:08.880090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.880056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" event={"ID":"1cb43a9a-3e61-4a9e-8a98-6db4b0437e65","Type":"ContainerDied","Data":"01010b525feac75a7be699d68cc026a8baa043103f2f454419c5523049ba4f04"} Apr 22 14:57:08.880090 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.880090 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc" Apr 22 14:57:08.880524 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.880097 2579 scope.go:117] "RemoveContainer" containerID="2944c78075c52c0330b7412e7e398174f7a4669d23d96b94a1ec133b80c6a421" Apr 22 14:57:08.905037 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.905012 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:57:08.910876 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:08.910855 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-055dc-predictor-54dcc78869-9lbzc"] Apr 22 14:57:09.761281 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:09.761248 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" path="/var/lib/kubelet/pods/1cb43a9a-3e61-4a9e-8a98-6db4b0437e65/volumes" Apr 22 14:57:12.493944 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:12.493896 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:57:17.493500 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:17.493464 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:57:17.493928 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:17.493558 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" Apr 22 14:57:22.493754 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:22.493707 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:57:27.493586 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:27.493544 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:57:32.494093 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:32.494047 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 22 14:57:34.952705 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:34.952675 2579 generic.go:358] "Generic (PLEG): container finished" podID="976866d0-ceea-428e-b669-8b5641eb6c17" containerID="51eaff106b1ca8840487e16e9ddccbb1d1dc149e995f09c76881b0e07a7600d0" exitCode=0 Apr 22 14:57:34.953091 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:34.952722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" event={"ID":"976866d0-ceea-428e-b669-8b5641eb6c17","Type":"ContainerDied","Data":"51eaff106b1ca8840487e16e9ddccbb1d1dc149e995f09c76881b0e07a7600d0"} Apr 22 14:57:35.057597 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.057576 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" Apr 22 14:57:35.127902 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.127877 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle\") pod \"976866d0-ceea-428e-b669-8b5641eb6c17\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " Apr 22 14:57:35.127902 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.127905 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls\") pod \"976866d0-ceea-428e-b669-8b5641eb6c17\" (UID: \"976866d0-ceea-428e-b669-8b5641eb6c17\") " Apr 22 14:57:35.128230 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.128198 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "976866d0-ceea-428e-b669-8b5641eb6c17" (UID: "976866d0-ceea-428e-b669-8b5641eb6c17"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:57:35.129769 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.129749 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "976866d0-ceea-428e-b669-8b5641eb6c17" (UID: "976866d0-ceea-428e-b669-8b5641eb6c17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:57:35.228954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.228886 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976866d0-ceea-428e-b669-8b5641eb6c17-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:57:35.228954 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.228912 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/976866d0-ceea-428e-b669-8b5641eb6c17-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 14:57:35.956206 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.956178 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" Apr 22 14:57:35.956621 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.956173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv" event={"ID":"976866d0-ceea-428e-b669-8b5641eb6c17","Type":"ContainerDied","Data":"07e99657d3ef898e506704ead072d46185b4e329b09dd679420caa79c548814b"} Apr 22 14:57:35.956621 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.956308 2579 scope.go:117] "RemoveContainer" containerID="51eaff106b1ca8840487e16e9ddccbb1d1dc149e995f09c76881b0e07a7600d0" Apr 22 14:57:35.973691 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.973663 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"] Apr 22 14:57:35.979305 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:35.979283 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-055dc-576fcf5468-jgbgv"] Apr 22 14:57:37.760386 ip-10-0-137-106 kubenswrapper[2579]: I0422 14:57:37.760347 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" path="/var/lib/kubelet/pods/976866d0-ceea-428e-b669-8b5641eb6c17/volumes" Apr 22 15:04:29.139380 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:29.139344 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"] Apr 22 15:04:29.141678 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:29.139587 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" containerID="cri-o://4c2cf21919365ec1b96eb47adbfb5bd52eb19de5da0d7853a0010c880260be2c" gracePeriod=30 Apr 22 15:04:29.406431 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:29.406359 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"] Apr 22 15:04:29.406638 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:29.406585 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" containerID="cri-o://598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537" gracePeriod=30 Apr 22 15:04:30.381493 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381466 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nct5s/must-gather-flz5z"] Apr 22 15:04:30.381860 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381793 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" Apr 22 15:04:30.381860 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381807 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" Apr 22 15:04:30.381860 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381829 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" Apr 22 15:04:30.381860 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381855 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" Apr 22 15:04:30.381996 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381941 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="976866d0-ceea-428e-b669-8b5641eb6c17" containerName="splitter-graph-055dc" Apr 22 15:04:30.381996 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.381953 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cb43a9a-3e61-4a9e-8a98-6db4b0437e65" containerName="kserve-container" Apr 22 15:04:30.385050 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.385024 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.388103 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.388078 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nct5s\"/\"kube-root-ca.crt\"" Apr 22 15:04:30.388225 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.388112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nct5s\"/\"openshift-service-ca.crt\"" Apr 22 15:04:30.388225 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.388134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.388225 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.388211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rwzb\" (UniqueName: \"kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.389522 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.389496 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nct5s\"/\"default-dockercfg-dp8wh\"" Apr 22 15:04:30.440402 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.440372 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nct5s/must-gather-flz5z"] Apr 22 15:04:30.488702 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.488676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.488855 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.488723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rwzb\" (UniqueName: \"kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.489068 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.489048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.498677 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.498651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rwzb\" (UniqueName: \"kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb\") pod \"must-gather-flz5z\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.705825 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.705747 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:04:30.831340 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.831311 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nct5s/must-gather-flz5z"] Apr 22 15:04:30.834304 ip-10-0-137-106 kubenswrapper[2579]: W0422 15:04:30.834274 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9f5ef2d_434a_48e8_a23d_e80c68579b68.slice/crio-5a3fdcf6d44449b4f4b84f35975187f1c6c6b457c01f49d453c14b6e694d0533 WatchSource:0}: Error finding container 5a3fdcf6d44449b4f4b84f35975187f1c6c6b457c01f49d453c14b6e694d0533: Status 404 returned error can't find the container with id 5a3fdcf6d44449b4f4b84f35975187f1c6c6b457c01f49d453c14b6e694d0533 Apr 22 15:04:30.835992 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:30.835972 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:04:31.088253 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:31.088221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nct5s/must-gather-flz5z" event={"ID":"e9f5ef2d-434a-48e8-a23d-e80c68579b68","Type":"ContainerStarted","Data":"5a3fdcf6d44449b4f4b84f35975187f1c6c6b457c01f49d453c14b6e694d0533"} Apr 22 15:04:31.156758 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:31.156726 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 22 15:04:32.354195 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:32.354172 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" Apr 22 15:04:32.378913 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:32.378881 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:33.094414 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.094378 2579 generic.go:358] "Generic (PLEG): container finished" podID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerID="598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537" exitCode=0 Apr 22 15:04:33.094569 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.094454 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" Apr 22 15:04:33.094569 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.094468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" event={"ID":"8fc9f5d6-3ebf-4867-a668-9d844efd1d0f","Type":"ContainerDied","Data":"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537"} Apr 22 15:04:33.094569 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.094506 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9" event={"ID":"8fc9f5d6-3ebf-4867-a668-9d844efd1d0f","Type":"ContainerDied","Data":"ad45bdc947e66daa54a58f6cef90daad0678f433c417518021f518ccec629937"} Apr 22 15:04:33.094569 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.094522 2579 scope.go:117] "RemoveContainer" containerID="598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537" Apr 22 15:04:33.102038 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.102019 2579 scope.go:117] "RemoveContainer" containerID="598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537" Apr 22 15:04:33.102281 ip-10-0-137-106 kubenswrapper[2579]: E0422 15:04:33.102261 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537\": container with ID starting with 598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537 not found: ID does not exist" containerID="598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537" Apr 22 15:04:33.102362 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.102287 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537"} err="failed to get container status \"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537\": rpc error: code = NotFound desc = could not find container \"598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537\": container with ID starting with 598a4ceaa00ad05914bde9ab919d7d2202022bc801f7fc62b3827f45655ef537 not found: ID does not exist" Apr 22 15:04:33.114861 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.114824 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"] Apr 22 15:04:33.120330 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.120308 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4e291-predictor-84859f6fd9-kzwr9"] Apr 22 15:04:33.760868 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:33.760820 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" path="/var/lib/kubelet/pods/8fc9f5d6-3ebf-4867-a668-9d844efd1d0f/volumes" Apr 22 15:04:37.381101 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:37.381058 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:38.112367 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:38.112332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nct5s/must-gather-flz5z" 
event={"ID":"e9f5ef2d-434a-48e8-a23d-e80c68579b68","Type":"ContainerStarted","Data":"490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182"} Apr 22 15:04:38.112367 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:38.112367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nct5s/must-gather-flz5z" event={"ID":"e9f5ef2d-434a-48e8-a23d-e80c68579b68","Type":"ContainerStarted","Data":"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930"} Apr 22 15:04:38.132633 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:38.132584 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nct5s/must-gather-flz5z" podStartSLOduration=1.778510475 podStartE2EDuration="8.132565509s" podCreationTimestamp="2026-04-22 15:04:30 +0000 UTC" firstStartedPulling="2026-04-22 15:04:30.836094033 +0000 UTC m=+2921.639179969" lastFinishedPulling="2026-04-22 15:04:37.190149062 +0000 UTC m=+2927.993235003" observedRunningTime="2026-04-22 15:04:38.131907708 +0000 UTC m=+2928.934993688" watchObservedRunningTime="2026-04-22 15:04:38.132565509 +0000 UTC m=+2928.935651466" Apr 22 15:04:42.378325 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:42.378279 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:42.378742 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:42.378389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 15:04:45.920571 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:45.920539 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:46.673078 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:46.673035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:47.377968 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:47.377935 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:47.491560 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:47.491527 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:48.214089 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:48.214061 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:49.025577 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:49.025547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:49.738099 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:49.738057 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:50.439234 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:50.439204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:51.144581 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:51.144553 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:51.861724 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:51.861693 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:52.378903 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:52.378871 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:52.571657 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:52.571628 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:53.284872 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:53.284823 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:53.985715 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:53.985681 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-4e291-65ccfb5fb-lzpv7_d6062ebe-8887-4cb7-b11c-af631c1f9061/switch-graph-4e291/0.log" Apr 22 15:04:57.173375 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:57.173338 2579 generic.go:358] "Generic (PLEG): container finished" podID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerID="3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930" exitCode=0 Apr 22 15:04:57.173792 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:57.173418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nct5s/must-gather-flz5z" event={"ID":"e9f5ef2d-434a-48e8-a23d-e80c68579b68","Type":"ContainerDied","Data":"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930"} Apr 22 15:04:57.173792 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:57.173719 2579 scope.go:117] "RemoveContainer" containerID="3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930" Apr 22 15:04:57.378331 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:57.378299 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:04:57.470632 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:57.470544 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nct5s_must-gather-flz5z_e9f5ef2d-434a-48e8-a23d-e80c68579b68/gather/0.log" Apr 22 15:04:59.180637 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.180607 
2579 generic.go:358] "Generic (PLEG): container finished" podID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerID="4c2cf21919365ec1b96eb47adbfb5bd52eb19de5da0d7853a0010c880260be2c" exitCode=0 Apr 22 15:04:59.181007 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.180687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" event={"ID":"d6062ebe-8887-4cb7-b11c-af631c1f9061","Type":"ContainerDied","Data":"4c2cf21919365ec1b96eb47adbfb5bd52eb19de5da0d7853a0010c880260be2c"} Apr 22 15:04:59.270380 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.270359 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 15:04:59.319540 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.319513 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle\") pod \"d6062ebe-8887-4cb7-b11c-af631c1f9061\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " Apr 22 15:04:59.319660 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.319545 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls\") pod \"d6062ebe-8887-4cb7-b11c-af631c1f9061\" (UID: \"d6062ebe-8887-4cb7-b11c-af631c1f9061\") " Apr 22 15:04:59.319869 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.319821 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d6062ebe-8887-4cb7-b11c-af631c1f9061" (UID: "d6062ebe-8887-4cb7-b11c-af631c1f9061"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:04:59.321459 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.321437 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d6062ebe-8887-4cb7-b11c-af631c1f9061" (UID: "d6062ebe-8887-4cb7-b11c-af631c1f9061"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:04:59.420501 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.420438 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6062ebe-8887-4cb7-b11c-af631c1f9061-openshift-service-ca-bundle\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 15:04:59.420501 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:04:59.420462 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6062ebe-8887-4cb7-b11c-af631c1f9061-proxy-tls\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 15:05:00.184301 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:00.184228 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" Apr 22 15:05:00.184668 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:00.184221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7" event={"ID":"d6062ebe-8887-4cb7-b11c-af631c1f9061","Type":"ContainerDied","Data":"c15d545c32bc8873e91421b5b08126655e388acc51036fd757dadd80bd02d67a"} Apr 22 15:05:00.184668 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:00.184350 2579 scope.go:117] "RemoveContainer" containerID="4c2cf21919365ec1b96eb47adbfb5bd52eb19de5da0d7853a0010c880260be2c" Apr 22 15:05:00.232812 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:00.232784 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"] Apr 22 15:05:00.241360 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:00.241335 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4e291-65ccfb5fb-lzpv7"] Apr 22 15:05:01.571250 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:01.571219 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-svgjx_83a28155-8693-412d-81cc-13f4d8f8cf4b/global-pull-secret-syncer/0.log" Apr 22 15:05:01.762347 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:01.762317 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" path="/var/lib/kubelet/pods/d6062ebe-8887-4cb7-b11c-af631c1f9061/volumes" Apr 22 15:05:01.774870 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:01.774846 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rc7xq_a8b2d183-dda4-4cb2-9822-b4d4ed73f8c4/konnectivity-agent/0.log" Apr 22 15:05:01.894393 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:01.894328 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-106.ec2.internal_daadd8592f265af4bb30938041dae753/haproxy/0.log" Apr 22 15:05:02.939526 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:02.939496 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nct5s/must-gather-flz5z"] Apr 22 15:05:02.939932 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:02.939702 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-nct5s/must-gather-flz5z" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="copy" containerID="cri-o://490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182" gracePeriod=2 Apr 22 15:05:02.944076 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:02.944050 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nct5s/must-gather-flz5z"] Apr 22 15:05:03.161610 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.161587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nct5s_must-gather-flz5z_e9f5ef2d-434a-48e8-a23d-e80c68579b68/copy/0.log" Apr 22 15:05:03.161917 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.161900 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:05:03.165169 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.165145 2579 status_manager.go:895] "Failed to get status for pod" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" pod="openshift-must-gather-nct5s/must-gather-flz5z" err="pods \"must-gather-flz5z\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-nct5s\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 22 15:05:03.194604 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.194553 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nct5s_must-gather-flz5z_e9f5ef2d-434a-48e8-a23d-e80c68579b68/copy/0.log" Apr 22 15:05:03.194911 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.194892 2579 generic.go:358] "Generic (PLEG): container finished" podID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerID="490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182" exitCode=143 Apr 22 15:05:03.194998 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.194932 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nct5s/must-gather-flz5z" Apr 22 15:05:03.195055 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.195006 2579 scope.go:117] "RemoveContainer" containerID="490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182" Apr 22 15:05:03.197711 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.197685 2579 status_manager.go:895] "Failed to get status for pod" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" pod="openshift-must-gather-nct5s/must-gather-flz5z" err="pods \"must-gather-flz5z\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-nct5s\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 22 15:05:03.201565 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.201549 2579 scope.go:117] "RemoveContainer" containerID="3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930" Apr 22 15:05:03.213048 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.212980 2579 scope.go:117] "RemoveContainer" containerID="490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182" Apr 22 15:05:03.213231 ip-10-0-137-106 kubenswrapper[2579]: E0422 15:05:03.213213 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182\": container with ID starting with 490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182 not found: ID does not exist" containerID="490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182" Apr 22 15:05:03.213286 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.213240 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182"} err="failed to get container status \"490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182\": rpc error: code = NotFound desc = could not find container \"490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182\": container with ID starting with 490f9722baf84b057f5db55e3216eac9e545fe8415ecc843906aa76f0e37e182 not found: ID does not exist" Apr 22 15:05:03.213286 
ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.213256 2579 scope.go:117] "RemoveContainer" containerID="3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930" Apr 22 15:05:03.213472 ip-10-0-137-106 kubenswrapper[2579]: E0422 15:05:03.213456 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930\": container with ID starting with 3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930 not found: ID does not exist" containerID="3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930" Apr 22 15:05:03.213510 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.213476 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930"} err="failed to get container status \"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930\": rpc error: code = NotFound desc = could not find container \"3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930\": container with ID starting with 3040474a13d719460742a0cc54485019db910869542dcb532fff663a2c66d930 not found: ID does not exist" Apr 22 15:05:03.247749 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.247730 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rwzb\" (UniqueName: \"kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb\") pod \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " Apr 22 15:05:03.247849 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.247825 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output\") pod \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\" (UID: \"e9f5ef2d-434a-48e8-a23d-e80c68579b68\") " Apr 22 15:05:03.249159 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.249135 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e9f5ef2d-434a-48e8-a23d-e80c68579b68" (UID: "e9f5ef2d-434a-48e8-a23d-e80c68579b68"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:05:03.249709 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.249685 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb" (OuterVolumeSpecName: "kube-api-access-2rwzb") pod "e9f5ef2d-434a-48e8-a23d-e80c68579b68" (UID: "e9f5ef2d-434a-48e8-a23d-e80c68579b68"). InnerVolumeSpecName "kube-api-access-2rwzb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:05:03.348981 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.348956 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9f5ef2d-434a-48e8-a23d-e80c68579b68-must-gather-output\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 15:05:03.348981 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.348976 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rwzb\" (UniqueName: \"kubernetes.io/projected/e9f5ef2d-434a-48e8-a23d-e80c68579b68-kube-api-access-2rwzb\") on node \"ip-10-0-137-106.ec2.internal\" DevicePath \"\"" Apr 22 15:05:03.505113 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.505086 2579 status_manager.go:895] "Failed to get status for pod" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" pod="openshift-must-gather-nct5s/must-gather-flz5z" err="pods \"must-gather-flz5z\" is forbidden: User \"system:node:ip-10-0-137-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-nct5s\": no relationship found between node 'ip-10-0-137-106.ec2.internal' and this object" Apr 22 15:05:03.764943 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:03.764869 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" path="/var/lib/kubelet/pods/e9f5ef2d-434a-48e8-a23d-e80c68579b68/volumes" Apr 22 15:05:05.195492 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.195438 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rr2n6_2043c022-20b9-4167-9f2b-b4197849da3b/monitoring-plugin/0.log" Apr 22 15:05:05.234506 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.234479 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hg6mm_51a5a4df-f45c-48f8-b0bb-8605d0d627be/node-exporter/0.log" Apr 22 15:05:05.258217 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.258192 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hg6mm_51a5a4df-f45c-48f8-b0bb-8605d0d627be/kube-rbac-proxy/0.log" Apr 22 15:05:05.283708 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.283682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hg6mm_51a5a4df-f45c-48f8-b0bb-8605d0d627be/init-textfile/0.log" Apr 22 15:05:05.524377 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.524339 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8bwkd_59a394f8-6982-4782-9e92-b0377ea4a655/kube-rbac-proxy-main/0.log" Apr 22 15:05:05.554363 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.554336 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8bwkd_59a394f8-6982-4782-9e92-b0377ea4a655/kube-rbac-proxy-self/0.log" Apr 22 15:05:05.584036 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.584013 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8bwkd_59a394f8-6982-4782-9e92-b0377ea4a655/openshift-state-metrics/0.log" Apr 22 15:05:05.631091 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.631067 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/prometheus/0.log" Apr 22 
15:05:05.648725 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.648705 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/config-reloader/0.log"
Apr 22 15:05:05.677174 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.677150 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/thanos-sidecar/0.log"
Apr 22 15:05:05.730303 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.730247 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/kube-rbac-proxy-web/0.log"
Apr 22 15:05:05.809136 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.809116 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/kube-rbac-proxy/0.log"
Apr 22 15:05:05.867915 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.867890 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/kube-rbac-proxy-thanos/0.log"
Apr 22 15:05:05.903131 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:05.903108 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4d0fedaf-c65e-4389-b9e9-e4e607a788d7/init-config-reloader/0.log"
Apr 22 15:05:06.007707 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.007665 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-jk48g_0cf5c90d-03ee-47c0-9928-ef11da808610/prometheus-operator-admission-webhook/0.log"
Apr 22 15:05:06.042117 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.042093 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-867d7b45c-dq2gk_8db7bb95-61b6-4001-9976-00de970e2b49/telemeter-client/0.log"
Apr 22 15:05:06.066858 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.066821 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-867d7b45c-dq2gk_8db7bb95-61b6-4001-9976-00de970e2b49/reload/0.log"
Apr 22 15:05:06.097086 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.097062 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-867d7b45c-dq2gk_8db7bb95-61b6-4001-9976-00de970e2b49/kube-rbac-proxy/0.log"
Apr 22 15:05:06.135190 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.135169 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/thanos-query/0.log"
Apr 22 15:05:06.165099 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.165076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/kube-rbac-proxy-web/0.log"
Apr 22 15:05:06.195341 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.195314 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/kube-rbac-proxy/0.log"
Apr 22 15:05:06.220547 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.220526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/prom-label-proxy/0.log"
Apr 22 15:05:06.245927 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.245899 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/kube-rbac-proxy-rules/0.log"
Apr 22 15:05:06.272511 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:06.272442 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5499b445bb-tln7b_3dcd7284-1f97-413e-9b26-7705617a2be5/kube-rbac-proxy-metrics/0.log"
Apr 22 15:05:08.132521 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132485 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"]
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132919 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="gather"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132937 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="gather"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132957 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132965 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132978 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.132986 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container"
Apr 22 15:05:08.133006 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133007 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="copy"
Apr 22 15:05:08.133324 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133015 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="copy"
Apr 22 15:05:08.133324 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133077 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="copy"
Apr 22 15:05:08.133324 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133091 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9f5ef2d-434a-48e8-a23d-e80c68579b68" containerName="gather"
Apr 22 15:05:08.133324 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133103 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6062ebe-8887-4cb7-b11c-af631c1f9061" containerName="switch-graph-4e291"
Apr 22 15:05:08.133324 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.133114 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fc9f5d6-3ebf-4867-a668-9d844efd1d0f" containerName="kserve-container"
Apr 22 15:05:08.136820 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.136799 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.145877 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.145557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"openshift-service-ca.crt\""
Apr 22 15:05:08.146103 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.146081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zbnb6\"/\"default-dockercfg-g9xjx\""
Apr 22 15:05:08.149848 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.146397 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"kube-root-ca.crt\""
Apr 22 15:05:08.149848 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.149033 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"]
Apr 22 15:05:08.285064 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.285017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74fr\" (UniqueName: \"kubernetes.io/projected/cd8e5aae-72b2-458a-960b-6317b56b818c-kube-api-access-n74fr\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.285214 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.285083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-sys\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.285214 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.285117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-proc\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.285214 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.285144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-lib-modules\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.285214 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.285160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-podres\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385487 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-lib-modules\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385487 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-podres\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385487 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385464 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n74fr\" (UniqueName: \"kubernetes.io/projected/cd8e5aae-72b2-458a-960b-6317b56b818c-kube-api-access-n74fr\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385730 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-lib-modules\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385730 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-podres\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385730 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-sys\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385730 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-proc\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385918 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-sys\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.385918 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.385772 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cd8e5aae-72b2-458a-960b-6317b56b818c-proc\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.397199 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.397176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74fr\" (UniqueName: \"kubernetes.io/projected/cd8e5aae-72b2-458a-960b-6317b56b818c-kube-api-access-n74fr\") pod \"perf-node-gather-daemonset-k9q66\" (UID: \"cd8e5aae-72b2-458a-960b-6317b56b818c\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.447192 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.447170 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:08.561043 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:08.561014 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"]
Apr 22 15:05:08.564083 ip-10-0-137-106 kubenswrapper[2579]: W0422 15:05:08.564047 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcd8e5aae_72b2_458a_960b_6317b56b818c.slice/crio-94b116b3333e7694060c6088211c928d043c73d54090b001b20752dfff3421f5 WatchSource:0}: Error finding container 94b116b3333e7694060c6088211c928d043c73d54090b001b20752dfff3421f5: Status 404 returned error can't find the container with id 94b116b3333e7694060c6088211c928d043c73d54090b001b20752dfff3421f5
Apr 22 15:05:09.213043 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.213007 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66" event={"ID":"cd8e5aae-72b2-458a-960b-6317b56b818c","Type":"ContainerStarted","Data":"bc5255db9578fd392da8bbce0dc47aa059d8e477ed78d4df7beafb508ad3f29b"}
Apr 22 15:05:09.213043 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.213039 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66" event={"ID":"cd8e5aae-72b2-458a-960b-6317b56b818c","Type":"ContainerStarted","Data":"94b116b3333e7694060c6088211c928d043c73d54090b001b20752dfff3421f5"}
Apr 22 15:05:09.213454 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.213142 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:09.234308 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.234252 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66" podStartSLOduration=1.234233392 podStartE2EDuration="1.234233392s" podCreationTimestamp="2026-04-22 15:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:05:09.233580174 +0000 UTC m=+2960.036666143" watchObservedRunningTime="2026-04-22 15:05:09.234233392 +0000 UTC m=+2960.037319349"
Apr 22 15:05:09.496912 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.496880 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kh5ld_98cf922e-f6c2-4a14-ad74-c33e7bf501df/dns/0.log"
Apr 22 15:05:09.520889 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.520866 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kh5ld_98cf922e-f6c2-4a14-ad74-c33e7bf501df/kube-rbac-proxy/0.log"
Apr 22 15:05:09.694207 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:09.694185 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v7x7t_05f8e253-60dd-45b7-a7a0-6117d9b61223/dns-node-resolver/0.log"
Apr 22 15:05:10.268017 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:10.267982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvnzr_eb5dd517-9720-4de6-8414-93da58e2c537/node-ca/0.log"
Apr 22 15:05:11.633263 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:11.633226 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hjl8j_2ee495b6-b62e-473d-b4c4-23cc635944c3/serve-healthcheck-canary/0.log"
Apr 22 15:05:12.127479 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:12.127451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7kbgh_d897aa63-fe04-4bf8-a7d6-57109b418895/kube-rbac-proxy/0.log"
Apr 22 15:05:12.151571 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:12.151550 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7kbgh_d897aa63-fe04-4bf8-a7d6-57109b418895/exporter/0.log"
Apr 22 15:05:12.185567 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:12.185536 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7kbgh_d897aa63-fe04-4bf8-a7d6-57109b418895/extractor/0.log"
Apr 22 15:05:14.318447 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:14.318412 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-9t9rl_610dd975-29e1-440e-98a8-b8ad71b73274/manager/0.log"
Apr 22 15:05:14.824630 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:14.824602 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-rq7bs_1ae6367c-74d7-4ec3-8923-c5d1947551d8/s3-init/0.log"
Apr 22 15:05:15.224649 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:15.224583 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-k9q66"
Apr 22 15:05:19.565870 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:19.565827 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v6j8t_97ec2540-b192-41b1-b9eb-09a5b0da42fd/migrator/0.log"
Apr 22 15:05:19.592644 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:19.592620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v6j8t_97ec2540-b192-41b1-b9eb-09a5b0da42fd/graceful-termination/0.log"
Apr 22 15:05:21.522565 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.522540 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:05:21.555235 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.555214 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/egress-router-binary-copy/0.log"
Apr 22 15:05:21.586266 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.586246 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/cni-plugins/0.log"
Apr 22 15:05:21.616852 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.616811 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/bond-cni-plugin/0.log"
Apr 22 15:05:21.660177 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.660155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/routeoverride-cni/0.log"
Apr 22 15:05:21.712555 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.712536 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/whereabouts-cni-bincopy/0.log"
Apr 22 15:05:21.774361 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.774310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zs8tx_2f37f53f-e8b0-4392-a64b-126947d423d8/whereabouts-cni/0.log"
Apr 22 15:05:21.842225 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:21.842193 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f6c4x_173094ab-3108-4fc8-b54a-fd252e895dcb/kube-multus/0.log"
Apr 22 15:05:22.002856 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:22.002808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f984g_ce08d471-53a2-45d6-8243-b4455eddce85/network-metrics-daemon/0.log"
Apr 22 15:05:22.030872 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:22.030789 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f984g_ce08d471-53a2-45d6-8243-b4455eddce85/kube-rbac-proxy/0.log"
Apr 22 15:05:22.905203 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:22.905173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/ovn-controller/0.log"
Apr 22 15:05:22.958155 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:22.958124 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/ovn-acl-logging/0.log"
Apr 22 15:05:22.979255 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:22.979232 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/kube-rbac-proxy-node/0.log"
Apr 22 15:05:23.004817 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:23.004784 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:05:23.036744 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:23.036717 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/northd/0.log"
Apr 22 15:05:23.061256 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:23.061234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/nbdb/0.log"
Apr 22 15:05:23.089039 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:23.089010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/sbdb/0.log"
Apr 22 15:05:23.260847 ip-10-0-137-106 kubenswrapper[2579]: I0422 15:05:23.260807 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbkr7_5733e414-2e85-4ac0-ac62-62aace56bf89/ovnkube-controller/0.log"