Apr 16 13:56:49.029600 ip-10-0-128-129 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:49.029612 ip-10-0-128-129 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:49.029619 ip-10-0-128-129 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:49.029831 ip-10-0-128-129 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:57:00.644990 ip-10-0-128-129 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:57:00.645007 ip-10-0-128-129 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d0694bc160404e8fbef75989a443ed62 --
Apr 16 13:59:31.203594 ip-10-0-128-129 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:31.645383 ip-10-0-128-129 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:31.645383 ip-10-0-128-129 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:31.645383 ip-10-0-128-129 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:31.645383 ip-10-0-128-129 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:31.645383 ip-10-0-128-129 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
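On the first boot the unit never reached exec: a file the service references (its EnvironmentFile=, judging by the first message) was missing, which systemd reports as result 'resources', and the restart could not even be scheduled because crio.service was not loaded. After the reboot the kubelet starts, and the five deprecation warnings all point at the same remediation: carry the values in the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump below) instead of on the command line. A minimal sketch of the equivalent stanzas, assuming the flag values shown later in this log; the eviction threshold is illustrative, since --minimum-container-ttl-duration has no one-to-one config field and is superseded by the eviction settings, and --pod-infra-container-image has no KubeletConfiguration counterpart at all (with CRI-O the sandbox image is set by pause_image in the CRI-O config, which is what the 1.35 warning defers to):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint=/var/run/crio/crio.sock
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# illustrative stand-in for --minimum-container-ttl-duration (value not taken from this node)
evictionHard:
  memory.available: 100Mi

Note that command-line flags take precedence over the config file, so the flags would also have to be dropped from the kubelet.service unit for these values to win.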
Apr 16 13:59:31.646195 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.646128 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653421 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653438 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653442 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653445 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653448 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:31.653445 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653452 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653455 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653458 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653462 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653465 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653467 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653470 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653473 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653476 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653478 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653481 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653484 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653486 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653489 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653492 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653494 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653497 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653500 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653502 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653512 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:31.653646 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653515 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653518 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653520 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653523 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653525 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653527 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653530 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653533 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653535 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653538 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653540 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653543 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653545 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653548 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653551 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653554 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653556 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653559 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653562 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:31.654121 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653564 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653568 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653571 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653574 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653576 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653579 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653581 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653584 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653586 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653589 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653591 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653594 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653597 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653601 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653605 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653608 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653610 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653613 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653617 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653621 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:31.654710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653624 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653627 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653629 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653632 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653634 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653637 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653639 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653645 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653648 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653650 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653653 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653656 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653658 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653661 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653665 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653668 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653670 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653672 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653675 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653678 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:31.655163 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653680 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:31.655671 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.653683 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
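Every feature_gate.go:328 warning above names a gate this kubelet build does not define; they are logged at warning level and skipped, the two gates the kubelet does know (KMSv1, deprecated, and ServiceAccountTokenNodeBinding, already GA) each draw their own notice, and the feature_gate.go:384 summary near the end of this excerpt lists only the gates that were actually applied. A sketch of how such names plausibly reach the kubelet, assuming (as on OpenShift nodes) that the rendered /etc/kubernetes/kubelet.conf carries the cluster-wide gate list rather than a kubelet-only one; the gate values below are placeholders, since the warnings do not print them:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true                          # kubelet gate, deprecated -> feature_gate.go:349 warning
  ServiceAccountTokenNodeBinding: true # kubelet gate, GA -> feature_gate.go:351 warning
  AlibabaPlatform: true                # cluster-level gate, unknown to the kubelet -> feature_gate.go:328 warning
  GatewayAPI: true                     # cluster-level gate, unknown to the kubelet -> feature_gate.go:328 warning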
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655093 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655101 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655111 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655115 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655120 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655123 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655128 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655132 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655135 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655138 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:31.657574 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655141 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655144 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655147 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655150 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655153 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655156 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655159 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655162 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655165 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655169 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655171 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655175 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655178 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655181 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655185 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655188 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655191 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655196 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655199 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655202 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655206 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655209 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655212 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655216 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655219 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:31.658088 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655221 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655224 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655228 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655231 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655236 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655249 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655252 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655255 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655258 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655262 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655265 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655271 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655274 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655277 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655280 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655283 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655286 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655288 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655291 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655294 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655297 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655300 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655303 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655306 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655309 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 13:59:31.658719 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655312 2572 flags.go:64] FLAG: --help="false"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655315 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-128-129.ec2.internal"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655318 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655321 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655324 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655328 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655345 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655348 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655351 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655354 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655357 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655360 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655363 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655366 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655369 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655372 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655374 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655377 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655381 2572 flags.go:64] FLAG: --lock-file=""
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655384 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655387 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655390 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655395 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 13:59:31.659330 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655398 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655400 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655403 2572 flags.go:64] FLAG: --logging-format="text"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655406 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655409 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655412 2572 flags.go:64] FLAG: --manifest-url=""
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655415 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655419 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655422 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655426 2572 flags.go:64] FLAG: --max-pods="110"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655429 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655432 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655434 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655437 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655440 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655443 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655446 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655453 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655456 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655459 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655463 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655466 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655470 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655473 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 13:59:31.659904 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655476 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655479 2572 flags.go:64] FLAG: --port="10250"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655482 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655485 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01a0bf3cf02fe221b"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655489 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655492 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655494 2572 flags.go:64] FLAG: --register-node="true"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655497 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655500 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655503 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655506 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655509 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655511 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655518 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655520 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655524 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655527 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655529 2572 flags.go:64] FLAG: --runonce="false"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655532 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655535 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655538 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655541 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655543 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655546 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655549 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655552 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 13:59:31.660516 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655555 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655557 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655560 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655569 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655572 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655575 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655578 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655583 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655586 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655589 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655593 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655596 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655599 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655602 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655605 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655608 2572 flags.go:64] FLAG: --v="2"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655611 2572 flags.go:64] FLAG: --version="false"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655615 2572 flags.go:64] FLAG: --vmodule=""
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655619 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 13:59:31.661134 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.655622 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
unrecognized feature gate: GatewayAPIController Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655930 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655933 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655935 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655937 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655940 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655942 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:31.663168 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655945 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655951 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655954 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655957 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.655960 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.656833 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.663303 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.663317 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663384 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663389 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663392 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663395 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663399 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 
13:59:31.663402 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663404 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663407 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:31.663680 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663410 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663413 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663416 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663419 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663421 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663424 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663426 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663429 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663431 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663434 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663436 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663439 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663441 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663444 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663447 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663449 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663452 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663456 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663458 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663461 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig 
Apr 16 13:59:31.664065 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663463 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663466 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663468 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663471 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663474 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663476 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663479 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663481 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663484 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663486 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663489 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663491 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663494 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663498 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663502 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663505 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663507 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663510 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663512 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:31.664567 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663515 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663519 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663522 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663525 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663528 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663531 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663533 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663536 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663539 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663541 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663544 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663546 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663549 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663551 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663554 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663556 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663559 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663562 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663564 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:31.665054 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663567 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663570 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663572 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663575 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663577 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663580 
2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663582 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663585 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663588 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663590 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663593 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663595 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663597 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663600 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663602 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663605 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663607 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663610 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663612 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:31.665531 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663615 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.663620 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663722 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663727 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663729 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663733 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:31.665996 ip-10-0-128-129 
kubenswrapper[2572]: W0416 13:59:31.663736 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663739 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663741 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663744 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663746 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663750 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663752 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663755 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663757 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663759 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:31.665996 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663762 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663764 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663767 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663769 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663772 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663774 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663777 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663779 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663781 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663784 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663786 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663789 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663791 2572 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663794 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663796 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663798 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663801 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663803 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663806 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663808 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:31.666404 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663810 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663813 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663815 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663818 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663820 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663823 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663825 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663828 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663830 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663833 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663836 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663838 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663841 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663843 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663845 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: 
W0416 13:59:31.663848 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663851 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663854 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663857 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663860 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:31.666881 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663863 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663865 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663867 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663870 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663873 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663875 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663877 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663880 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663882 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663885 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663888 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663890 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663892 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663895 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663898 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663900 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663903 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:31.667387 ip-10-0-128-129 
kubenswrapper[2572]: W0416 13:59:31.663907 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663910 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:31.667387 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663914 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663917 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663919 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663922 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663925 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663928 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663930 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663933 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663935 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663938 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663940 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663943 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:31.663945 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.663950 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.664748 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:59:31.667873 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.666719 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:59:31.668299 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.667504 2572 server.go:1019] "Starting client certificate rotation" Apr 16 13:59:31.668299 ip-10-0-128-129 kubenswrapper[2572]: 
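The sweeps above come from the kubelet's feature-gate parser: every key in the shared config that the kubelet's own registry does not know is logged as a warning rather than treated as a fatal error (the OpenShift-specific gates are consumed by other components reading the same config). A minimal, self-contained sketch of that pattern, using a made-up registry and config rather than the real k8s.io/component-base/featuregate package:

```go
package main

import "log"

// featureSpec mirrors the idea of a default value plus a maturity flag.
type featureSpec struct {
	defaultValue bool
	deprecated   bool
}

// known is a stand-in registry; the real kubelet registers many more gates.
var known = map[string]featureSpec{
	"KMSv1":       {defaultValue: false, deprecated: true},
	"NodeSwap":    {defaultValue: false},
	"ImageVolume": {defaultValue: true},
}

// setFromMap applies user-supplied overrides, warning on unknown keys
// instead of failing -- the behavior seen in the feature_gate.go:328 lines.
func setFromMap(overrides map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, spec := range known {
		effective[name] = spec.defaultValue
	}
	for name, value := range overrides {
		spec, ok := known[name]
		if !ok {
			log.Printf("unrecognized feature gate: %s", name)
			continue
		}
		if spec.deprecated {
			log.Printf("Setting deprecated feature gate %s=%v. It will be removed in a future release.", name, value)
		}
		effective[name] = value
	}
	return effective
}

func main() {
	gates := setFromMap(map[string]bool{
		"KMSv1":           true, // deprecated, still honored
		"AlibabaPlatform": true, // unknown to this registry -> warning only
	})
	log.Printf("feature gates: %v", gates)
}
```

Because parsing only warns, a misspelled gate name degrades silently; grepping startup logs for feature_gate.go:328 is the practical way to catch both typos and gates meant for other components.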
Apr 16 13:59:31.668299 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.667599 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:31.668299 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.667634 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:31.693411 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.693391 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:31.696390 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.696305 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:31.714500 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.714483 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:31.720051 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.720036 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:31.720294 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.720276 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:31.721885 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.721870 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:31.726375 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.726324 2572 fs.go:135] Filesystem UUIDs: map[06f875d4-3f65-408e-82b2-a5b52a479293:/dev/nvme0n1p3 60c360f4-0b16-4c0e-b43a-a8cc5c5a216b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:31.726437 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.726374 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:31.731908 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.731796 2572 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:31.729848433 +0000 UTC m=+0.408355323 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2500004 MemoryCapacity:32812154880 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fb82c3a52199c3aae3deb9acc042b SystemUUID:ec2fb82c-3a52-199c-3aae-3deb9acc042b BootID:d0694bc1-6040-4e8f-bef7-5989a443ed62 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406077440 Type:vfs Inodes:4005390 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562430976 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406077440 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:97:dc:42:d0:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:97:dc:42:d0:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:36:de:12:e8:10 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812154880 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:31.732482 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.732472 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:31.732550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.732539 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:31.733483 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.733463 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
nodeConfig={"NodeName":"ip-10-0-128-129.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:59:31.733665 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.733627 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:59:31.733665 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.733636 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:59:31.733665 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.733648 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:31.734283 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.734273 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:31.735462 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.735451 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:31.735708 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.735698 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:31.738272 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.738263 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:31.738304 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.738276 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:59:31.738304 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.738287 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:31.738304 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.738297 2572 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:31.738421 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.738306 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:31.739300 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.739289 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 
Apr 16 13:59:31.739361 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.739307 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:31.741034 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.741018 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kxxmk"
Apr 16 13:59:31.742102 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.742088 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:59:31.743786 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.743769 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:59:31.744963 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.744949 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.744968 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.744976 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.744985 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.744993 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745001 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745009 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745018 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745027 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 13:59:31.745036 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745036 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 13:59:31.745302 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745050 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 13:59:31.745302 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745063 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 13:59:31.745914 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745902 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 13:59:31.745962 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.745917 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 13:59:31.749568 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.749521 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kxxmk"
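csr-kxxmk going from approved to issued is the TLS bootstrap completing: the kubelet writes the signed client certificate under /var/lib/kubelet/pki and keeps rotating it from then on. A small stdlib sketch for checking how long the current client certificate remains valid; the kubelet-client-current.pem name is the usual kubelet layout, so adjust the path if your distribution differs:

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Combined cert+key file maintained by the kubelet's certificate manager.
	raw, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	// The file holds several PEM blocks; pick out the CERTIFICATE one(s).
	for block, rest := pem.Decode(raw); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			panic(err)
		}
		fmt.Printf("subject=%s notAfter=%s remaining=%s\n",
			cert.Subject, cert.NotAfter, time.Until(cert.NotAfter).Round(time.Minute))
	}
}
```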
"ip-10-0-128-129.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:31.751273 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.751255 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:31.751588 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.751308 2572 server.go:1295] "Started kubelet" Apr 16 13:59:31.751588 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.751412 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:31.751588 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.751499 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:31.751588 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.751575 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:31.752077 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.752008 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-129.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:31.752674 ip-10-0-128-129 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:31.752806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.752703 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:31.757136 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.757121 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:31.761703 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.761684 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:31.762363 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.762327 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.763163 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763179 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763195 2572 factory.go:55] Registering systemd factory Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763204 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763274 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763291 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763383 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:31.763416 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.763392 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:31.764182 ip-10-0-128-129 kubenswrapper[2572]: I0416 
Apr 16 13:59:31.764182 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764163 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:59:31.764275 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764204 2572 factory.go:153] Registering CRI-O factory
Apr 16 13:59:31.764275 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764220 2572 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:59:31.764275 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764244 2572 factory.go:103] Registering Raw factory
Apr 16 13:59:31.764275 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764262 2572 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:59:31.764776 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.764760 2572 manager.go:319] Starting recovery of all containers
Apr 16 13:59:31.765481 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.765461 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:31.767045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.767028 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:31.768118 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.768077 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:59:31.777983 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.777862 2572 manager.go:324] Recovery completed
Apr 16 13:59:31.778558 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.778534 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-129.ec2.internal\" not found" node="ip-10-0-128-129.ec2.internal"
Apr 16 13:59:31.782697 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.782683 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:31.785092 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785078 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:31.785165 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785104 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:31.785165 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785114 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:31.785568 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785554 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:59:31.785631 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785568 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:59:31.785631 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.785586 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:31.787873 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.787860 2572 policy_none.go:49] "None policy: Start"
Apr 16 13:59:31.787947 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.787878 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:59:31.787947 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.787891 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:59:31.830876 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.830863 2572 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:59:31.830992 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.830899 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:59:31.830992 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.830912 2572 server.go:85] "Starting device plugin registration server"
Apr 16 13:59:31.831137 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.831125 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:59:31.831192 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.831138 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:59:31.831295 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.831264 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:59:31.831395 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.831376 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:59:31.831395 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.831393 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:59:31.831721 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.831708 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:59:31.831786 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.831742 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:31.896644 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.896596 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:59:31.897885 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.897871 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:59:31.897967 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.897896 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:59:31.897967 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.897918 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:59:31.897967 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.897927 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:31.897967 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.897963 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:31.900125 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.900107 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:31.932243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.932225 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:31.933089 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.933074 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:31.933089 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.933101 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:31.933233 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.933111 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:31.933233 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.933135 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-129.ec2.internal" Apr 16 13:59:31.938862 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.938849 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-129.ec2.internal" Apr 16 13:59:31.938936 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.938867 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-129.ec2.internal\": node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:31.955857 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:31.955839 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:31.998589 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.998564 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal"] Apr 16 13:59:31.998637 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.998623 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:31.999363 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.999330 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:31.999428 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.999375 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:31.999428 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:31.999389 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:32.000561 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.000549 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:32.000712 ip-10-0-128-129 
kubenswrapper[2572]: I0416 13:59:32.000698 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.000748 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.000733 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:32.001120 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001108 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:32.001168 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001126 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:32.001168 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001133 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:32.001168 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001138 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:32.001267 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001155 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:32.001267 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.001219 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:32.002745 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.002732 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.002799 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.002755 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:32.003328 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.003315 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:32.003406 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.003354 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:32.003406 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.003364 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:32.026817 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.026801 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-129.ec2.internal\" not found" node="ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.031072 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.031058 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-129.ec2.internal\" not found" node="ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.056583 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.056566 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:32.157423 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.157378 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:32.165723 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.165707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.165781 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.165729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.165781 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.165747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed2e5e70f0ff42ca6d01355865582d97-config\") pod \"kube-apiserver-proxy-ip-10-0-128-129.ec2.internal\" (UID: \"ed2e5e70f0ff42ca6d01355865582d97\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal" Apr 16 13:59:32.258118 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.258091 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found" Apr 16 13:59:32.266510 
ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.266570 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed2e5e70f0ff42ca6d01355865582d97-config\") pod \"kube-apiserver-proxy-ip-10-0-128-129.ec2.internal\" (UID: \"ed2e5e70f0ff42ca6d01355865582d97\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.266570 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.266570 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.266666 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c13d13ed86434247999d294ebb97dea1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal\" (UID: \"c13d13ed86434247999d294ebb97dea1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.266666 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.266593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed2e5e70f0ff42ca6d01355865582d97-config\") pod \"kube-apiserver-proxy-ip-10-0-128-129.ec2.internal\" (UID: \"ed2e5e70f0ff42ca6d01355865582d97\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.330663 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.330633 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.333833 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.333814 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal"
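The reconciler entries above show the complete volume lifecycle for the two static pods: VerifyControllerAttachedVolume, then MountVolume, then MountVolume.SetUp succeeding. All of these are hostPath volumes, so no attach step is involved. A short sketch for cross-checking those volumes against the mirror pod's spec with the Python client (pod and namespace names are taken from the log; API access is an assumption):

```python
# Sketch: cross-check the hostPath volumes the reconciler mounted above
# against the static pod's (mirror) spec. Names come from the log.
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

pod = v1.read_namespaced_pod(
    name="kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal",
    namespace="openshift-machine-config-operator",
)
for vol in pod.spec.volumes or []:
    if vol.host_path:  # hostPath volumes need no attach step, only SetUp
        print(f"{vol.name}: hostPath {vol.host_path.path}")
```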
Apr 16 13:59:32.358435 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.358402 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.458989 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.458941 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.559567 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.559547 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.660164 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.660140 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.668519 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.668505 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:32.668635 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.668619 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:32.668635 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.668626 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:32.668728 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.668643 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:32.751545 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.751488 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:31 +0000 UTC" deadline="2027-10-18 17:51:45.298532469 +0000 UTC"
Apr 16 13:59:32.751545 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.751512 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13203h52m12.547023645s"
Apr 16 13:59:32.760221 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.760207 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.762205 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.762190 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:32.774374 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.774355 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:32.791093 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.791077 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kj2mg"
Apr 16 13:59:32.798522 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.798506 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kj2mg"
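The kubelet-serving rotation above follows the standard CSR flow: the kubelet posts a CertificateSigningRequest (csr-kj2mg here), an approver sets the Approved condition, and the signer then fills in status.certificate. A minimal sketch of inspecting that state with the Python client (kubeconfig access is an assumption):

```python
# Sketch: inspect kubelet-serving CSRs like csr-kj2mg from the log.
# The Approved condition and status.certificate are part of the
# certificates.k8s.io/v1 API.
from kubernetes import client, config

config.load_kube_config()
certs = client.CertificatesV1Api()

for csr in certs.list_certificate_signing_request().items:
    if csr.spec.signer_name != "kubernetes.io/kubelet-serving":
        continue
    approved = any(
        c.type == "Approved" and c.status == "True"
        for c in (csr.status.conditions or [])
    )
    issued = bool(csr.status.certificate)  # set once the signer issues it
    print(csr.metadata.name, "approved:", approved, "issued:", issued)
```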
Apr 16 13:59:32.860970 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:32.860954 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-129.ec2.internal\" not found"
Apr 16 13:59:32.933638 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.933619 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:32.963305 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.963290 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.975044 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.975024 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:32.975145 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.975118 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal"
Apr 16 13:59:32.983236 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:32.983223 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:33.257089 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:33.257059 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13d13ed86434247999d294ebb97dea1.slice/crio-041b5b4476abdb94ce38717b73e32de66d80702c2d4b42e5c32ba7af6e8088d8 WatchSource:0}: Error finding container 041b5b4476abdb94ce38717b73e32de66d80702c2d4b42e5c32ba7af6e8088d8: Status 404 returned error can't find the container with id 041b5b4476abdb94ce38717b73e32de66d80702c2d4b42e5c32ba7af6e8088d8
Apr 16 13:59:33.261444 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.261429 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:33.567659 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:33.567635 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2e5e70f0ff42ca6d01355865582d97.slice/crio-adf0e42f96ca7ffe30da1efb7f4fa5cf6f435f07904f506de2cc824a40f795cb WatchSource:0}: Error finding container adf0e42f96ca7ffe30da1efb7f4fa5cf6f435f07904f506de2cc824a40f795cb: Status 404 returned error can't find the container with id adf0e42f96ca7ffe30da1efb7f4fa5cf6f435f07904f506de2cc824a40f795cb
Apr 16 13:59:33.739173 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.739149 2572 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:33.746543 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.746514 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:33.746851 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.746829 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k","openshift-image-registry/node-ca-8nsr4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal","openshift-multus/multus-additional-cni-plugins-ghk8n","openshift-multus/network-metrics-daemon-7bhkl","openshift-network-operator/iptables-alerter-kx5cs","openshift-ovn-kubernetes/ovnkube-node-2vdts","kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vmp2w","openshift-dns/node-resolver-d7g97","openshift-multus/multus-x62b2","openshift-network-diagnostics/network-check-target-bh7db","kube-system/konnectivity-agent-srjct"]
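The "SyncLoop ADD" source="api" entry above is the moment the kubelet receives its API-sourced pods (mostly daemonset members scheduled to this node), as opposed to the file-sourced static pods seen earlier. A sketch of listing the same set with a field selector (node name taken from the log; API access is an assumption):

```python
# Sketch: list the pods the apiserver has bound to this node, i.e. the
# set delivered in the "SyncLoop ADD" source="api" entry above.
from kubernetes import client, config

NODE_NAME = "ip-10-0-128-129.ec2.internal"  # from the log

config.load_kube_config()
v1 = client.CoreV1Api()

pods = v1.list_pod_for_all_namespaces(
    field_selector=f"spec.nodeName={NODE_NAME}"
)
for pod in pods.items:
    print(f"{pod.metadata.namespace}/{pod.metadata.name}")
```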
Apr 16 13:59:33.751312 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.751298 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vmp2w"
Apr 16 13:59:33.752436 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.752414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8nsr4"
Apr 16 13:59:33.752529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.752518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-srjct"
Apr 16 13:59:33.753529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.753513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ghk8n"
Apr 16 13:59:33.753647 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.753628 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:33.753804 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.753727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7cvl4\""
Apr 16 13:59:33.753997 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.753984 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:33.754572 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.754555 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 13:59:33.754642 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.754621 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649"
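The "Error syncing pod" entry above is the usual bootstrap ordering problem: pods that need the cluster network cannot sync until a CNI config appears in /etc/kubernetes/cni/net.d/, which on this cluster is expected only after ovnkube-node (added in the same SyncLoop batch) is running. A sketch of checking for that config directly on the node (the directory is the one named in the error; the accepted extensions are an assumption):

```python
# Sketch: check on the node whether a CNI config has been written yet.
# Until one exists, the kubelet keeps reporting NetworkReady=false.
import pathlib

CNI_CONF_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")

confs = sorted(
    p for p in CNI_CONF_DIR.glob("*")
    if p.suffix in {".conf", ".conflist", ".json"}
)
if confs:
    print("CNI configured:", ", ".join(p.name for p in confs))
else:
    print("no CNI configuration yet; NetworkReady will stay false")
```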
pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:33.754802 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.754785 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:33.754979 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.754957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.755073 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755039 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:33.755203 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755182 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:33.755389 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755264 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8vhv7\"" Apr 16 13:59:33.755389 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755315 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-67l4s\"" Apr 16 13:59:33.755708 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.755928 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755908 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.756032 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.755936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:33.756162 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.756148 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:33.756324 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.756309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:33.756483 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.756467 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.756526 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.756309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.756749 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.756736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vsbf4\"" Apr 16 13:59:33.757828 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.757811 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.758001 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.757985 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 
16 13:59:33.758061 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.758039 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nnzhc\"" Apr 16 13:59:33.758155 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.758138 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.758383 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.758368 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.758461 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.758416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.759706 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.759690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.760473 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.760453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:33.760628 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.760610 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2ffqx\"" Apr 16 13:59:33.760784 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.760770 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:33.760907 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.760893 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.761970 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.762627 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p46mk\"" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.762648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.762690 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.763259 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763073 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:33.763606 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763468 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:33.763606 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763503 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:33.763744 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-h5c4g\"" Apr 16 13:59:33.763833 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763798 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.763833 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.763813 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:33.764107 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.764080 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:33.764676 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.764656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnffk\"" Apr 16 13:59:33.764753 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.764706 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.764800 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.764780 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:33.766619 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.766604 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:33.774964 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.774947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-cni-binary-copy\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775054 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.774970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-tuned\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775054 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.774986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwgd\" (UniqueName: \"kubernetes.io/projected/ce2074e5-ffeb-4776-a271-517ad48e47e1-kube-api-access-6vwgd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775054 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-socket-dir-parent\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775054 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovn-node-metrics-cert\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-systemd-units\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-systemd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2lf\" (UniqueName: \"kubernetes.io/projected/8e6892ba-2fcc-4246-87a3-cb11034c5167-kube-api-access-ms2lf\") pod 
\"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-system-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-modprobe-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05bbc014-12fb-4a97-900d-ab870f220e6f-agent-certs\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwvdq\" (UniqueName: \"kubernetes.io/projected/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-kube-api-access-rwvdq\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nhh\" (UniqueName: \"kubernetes.io/projected/8e7d9455-87de-4016-8826-39fe981aa729-kube-api-access-c8nhh\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-kubelet\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-run\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775262 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775251 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl8x\" (UniqueName: \"kubernetes.io/projected/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-kube-api-access-7bl8x\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775273 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-kubelet\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-ovn\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e7d9455-87de-4016-8826-39fe981aa729-tmp-dir\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-os-release\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8e6892ba-2fcc-4246-87a3-cb11034c5167-iptables-alerter-script\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e6892ba-2fcc-4246-87a3-cb11034c5167-host-slash\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-hostroot\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-bin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-etc-kubernetes\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysconfig\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-sys\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-host\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-etc-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-node-log\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.775806 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-etc-selinux\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfpxx\" (UniqueName: \"kubernetes.io/projected/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kube-api-access-lfpxx\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 
13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-netns\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-lib-modules\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-netns\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-log-socket\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775687 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-bin\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-cnibin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-multus-certs\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/05bbc014-12fb-4a97-900d-ab870f220e6f-konnectivity-ca\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-socket-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-multus-daemon-config\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-host\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-serviceca\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-script-lib\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.776529 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-config\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4z8c\" (UniqueName: \"kubernetes.io/projected/59442fa9-d5a4-452c-bf14-f93d58af99dc-kube-api-access-q4z8c\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " 
pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-systemd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-slash\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-env-overrides\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.775993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-sys-fs\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-multus\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776108 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-kubernetes\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-conf\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.777278 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-k8s-cni-cncf-io\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e7d9455-87de-4016-8826-39fe981aa729-hosts-file\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-var-lib-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-system-cni-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-device-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-netd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-cnibin\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-conf-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnq6\" (UniqueName: \"kubernetes.io/projected/42f7372f-60f8-484f-bdc7-063aea09785d-kube-api-access-mwnq6\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-tmp\") pod \"tuned-vmp2w\" (UID: 
\"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gnz\" (UniqueName: \"kubernetes.io/projected/98c854f8-33e5-46ea-aa35-7026190215b7-kube-api-access-q6gnz\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-registration-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-os-release\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.778045 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.776620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-var-lib-kubelet\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.800107 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.800086 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:32 +0000 UTC" deadline="2027-11-09 21:54:27.09804818 +0000 UTC" Apr 16 13:59:33.800208 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.800117 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13735h54m53.297943401s" Apr 16 13:59:33.877353 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-ovn\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.877444 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e7d9455-87de-4016-8826-39fe981aa729-tmp-dir\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " 
pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.877444 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-os-release\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.877444 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.877444 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8e6892ba-2fcc-4246-87a3-cb11034c5167-iptables-alerter-script\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.877632 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e6892ba-2fcc-4246-87a3-cb11034c5167-host-slash\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.877632 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-hostroot\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877632 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-hostroot\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877632 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-ovn\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.877632 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-os-release\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e6892ba-2fcc-4246-87a3-cb11034c5167-host-slash\") pod 
\"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-bin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-etc-kubernetes\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysconfig\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-etc-kubernetes\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-bin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysconfig\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.877854 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-sys\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-host\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-etc-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: 
\"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-sys\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-host\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-etc-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.877989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-node-log\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e7d9455-87de-4016-8826-39fe981aa729-tmp-dir\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-etc-selinux\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-node-log\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfpxx\" (UniqueName: \"kubernetes.io/projected/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kube-api-access-lfpxx\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-netns\") pod \"multus-x62b2\" (UID: 
\"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-etc-selinux\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-lib-modules\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-netns\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-netns\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878210 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8e6892ba-2fcc-4246-87a3-cb11034c5167-iptables-alerter-script\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-netns\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-log-socket\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-bin\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-cnibin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-lib-modules\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-log-socket\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-cnibin\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-bin\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-multus-certs\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-multus-certs\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/05bbc014-12fb-4a97-900d-ab870f220e6f-konnectivity-ca\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-socket-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.878733 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-multus-daemon-config\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.878744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-socket-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-host\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-serviceca\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-script-lib\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-config\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879311 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879310 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-host\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4z8c\" (UniqueName: \"kubernetes.io/projected/59442fa9-d5a4-452c-bf14-f93d58af99dc-kube-api-access-q4z8c\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-systemd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-slash\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-env-overrides\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-sys-fs\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-multus\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 
16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05bbc014-12fb-4a97-900d-ab870f220e6f-konnectivity-ca\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-cni-multus\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-slash\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.879590 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-kubernetes\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.879669 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-multus-daemon-config\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.879758 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:34.379707166 +0000 UTC m=+3.058214055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-conf\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-kubernetes\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-sys-fs\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-conf\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-sysctl-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-config\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-env-overrides\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.880243 ip-10-0-128-129 
kubenswrapper[2572]: I0416 13:59:33.879917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-k8s-cni-cncf-io\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.880243 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovnkube-script-lib\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-run-k8s-cni-cncf-io\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e7d9455-87de-4016-8826-39fe981aa729-hosts-file\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.879969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-systemd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e7d9455-87de-4016-8826-39fe981aa729-hosts-file\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-var-lib-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-system-cni-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-var-lib-openvswitch\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98c854f8-33e5-46ea-aa35-7026190215b7-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-device-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-netd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-cnibin\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880211 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-system-cni-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881043 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-device-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-cnibin\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-cni-netd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-serviceca\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-conf-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnq6\" (UniqueName: \"kubernetes.io/projected/42f7372f-60f8-484f-bdc7-063aea09785d-kube-api-access-mwnq6\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-conf-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-tmp\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880324 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6gnz\" (UniqueName: \"kubernetes.io/projected/98c854f8-33e5-46ea-aa35-7026190215b7-kube-api-access-q6gnz\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-registration-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-os-release\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-var-lib-kubelet\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-cni-binary-copy\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-registration-dir\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-os-release\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-tuned\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" 
Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwgd\" (UniqueName: \"kubernetes.io/projected/ce2074e5-ffeb-4776-a271-517ad48e47e1-kube-api-access-6vwgd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.881900 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880526 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-var-lib-kubelet\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-socket-dir-parent\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880561 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovn-node-metrics-cert\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-systemd-units\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-multus-socket-dir-parent\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-systemd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2lf\" (UniqueName: \"kubernetes.io/projected/8e6892ba-2fcc-4246-87a3-cb11034c5167-kube-api-access-ms2lf\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 
13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-systemd-units\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-system-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-modprobe-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-system-cni-dir\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-run-systemd\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05bbc014-12fb-4a97-900d-ab870f220e6f-agent-certs\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwvdq\" (UniqueName: \"kubernetes.io/projected/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-kube-api-access-rwvdq\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nhh\" (UniqueName: \"kubernetes.io/projected/8e7d9455-87de-4016-8826-39fe981aa729-kube-api-access-c8nhh\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-kubelet\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 
16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-run\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.882738 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bl8x\" (UniqueName: \"kubernetes.io/projected/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-kube-api-access-7bl8x\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-kubelet\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42f7372f-60f8-484f-bdc7-063aea09785d-host-var-lib-kubelet\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.881016 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-run\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.881113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-host-kubelet\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.880932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f7372f-60f8-484f-bdc7-063aea09785d-cni-binary-copy\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.881197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-modprobe-d\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.883550 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.881297 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98c854f8-33e5-46ea-aa35-7026190215b7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.883740 
ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.883564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-ovn-node-metrics-cert\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.883740 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.883644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-tmp\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.883740 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.883667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce2074e5-ffeb-4776-a271-517ad48e47e1-etc-tuned\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.883890 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.883874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05bbc014-12fb-4a97-900d-ab870f220e6f-agent-certs\") pod \"konnectivity-agent-srjct\" (UID: \"05bbc014-12fb-4a97-900d-ab870f220e6f\") " pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:33.889864 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.889846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfpxx\" (UniqueName: \"kubernetes.io/projected/c8f7ebb9-7806-4373-82d7-d8b4b85bc435-kube-api-access-lfpxx\") pod \"aws-ebs-csi-driver-node-pj26k\" (UID: \"c8f7ebb9-7806-4373-82d7-d8b4b85bc435\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:33.893635 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.893613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6gnz\" (UniqueName: \"kubernetes.io/projected/98c854f8-33e5-46ea-aa35-7026190215b7-kube-api-access-q6gnz\") pod \"multus-additional-cni-plugins-ghk8n\" (UID: \"98c854f8-33e5-46ea-aa35-7026190215b7\") " pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:33.894729 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.894710 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:33.894826 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.894733 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:33.894826 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.894748 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:33.894826 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:33.894801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf 
podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:34.394785067 +0000 UTC m=+3.073291954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:33.896287 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.896242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwgd\" (UniqueName: \"kubernetes.io/projected/ce2074e5-ffeb-4776-a271-517ad48e47e1-kube-api-access-6vwgd\") pod \"tuned-vmp2w\" (UID: \"ce2074e5-ffeb-4776-a271-517ad48e47e1\") " pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:33.896496 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.896479 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl8x\" (UniqueName: \"kubernetes.io/projected/6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660-kube-api-access-7bl8x\") pod \"node-ca-8nsr4\" (UID: \"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660\") " pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:33.896496 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.896487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwvdq\" (UniqueName: \"kubernetes.io/projected/bdeff3e6-46e4-45e1-a8f2-7934598cbfbd-kube-api-access-rwvdq\") pod \"ovnkube-node-2vdts\" (UID: \"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:33.897239 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.897222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnq6\" (UniqueName: \"kubernetes.io/projected/42f7372f-60f8-484f-bdc7-063aea09785d-kube-api-access-mwnq6\") pod \"multus-x62b2\" (UID: \"42f7372f-60f8-484f-bdc7-063aea09785d\") " pod="openshift-multus/multus-x62b2" Apr 16 13:59:33.897324 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.897310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nhh\" (UniqueName: \"kubernetes.io/projected/8e7d9455-87de-4016-8826-39fe981aa729-kube-api-access-c8nhh\") pod \"node-resolver-d7g97\" (UID: \"8e7d9455-87de-4016-8826-39fe981aa729\") " pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:33.897668 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.897647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2lf\" (UniqueName: \"kubernetes.io/projected/8e6892ba-2fcc-4246-87a3-cb11034c5167-kube-api-access-ms2lf\") pod \"iptables-alerter-kx5cs\" (UID: \"8e6892ba-2fcc-4246-87a3-cb11034c5167\") " pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:33.899112 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.899094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4z8c\" (UniqueName: \"kubernetes.io/projected/59442fa9-d5a4-452c-bf14-f93d58af99dc-kube-api-access-q4z8c\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:33.901072 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.901040 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal" event={"ID":"ed2e5e70f0ff42ca6d01355865582d97","Type":"ContainerStarted","Data":"adf0e42f96ca7ffe30da1efb7f4fa5cf6f435f07904f506de2cc824a40f795cb"} Apr 16 13:59:33.901969 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.901952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" event={"ID":"c13d13ed86434247999d294ebb97dea1","Type":"ContainerStarted","Data":"041b5b4476abdb94ce38717b73e32de66d80702c2d4b42e5c32ba7af6e8088d8"} Apr 16 13:59:33.960023 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:33.960003 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:34.049616 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.049578 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:34.065538 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.065508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" Apr 16 13:59:34.069220 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.069203 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8nsr4" Apr 16 13:59:34.072780 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.072691 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2074e5_ffeb_4776_a271_517ad48e47e1.slice/crio-7c79bed90db301d81f1209462e36827285171267d65b22074c8aaeea32df6f27 WatchSource:0}: Error finding container 7c79bed90db301d81f1209462e36827285171267d65b22074c8aaeea32df6f27: Status 404 returned error can't find the container with id 7c79bed90db301d81f1209462e36827285171267d65b22074c8aaeea32df6f27 Apr 16 13:59:34.074910 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.074888 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:34.080145 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.080087 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3e4196_7c1a_4fd5_b7a3_aac08e8eb660.slice/crio-d8043ff23d7c0d60b24b736286eb6794ccc82479b2567620d825c564ae0df322 WatchSource:0}: Error finding container d8043ff23d7c0d60b24b736286eb6794ccc82479b2567620d825c564ae0df322: Status 404 returned error can't find the container with id d8043ff23d7c0d60b24b736286eb6794ccc82479b2567620d825c564ae0df322 Apr 16 13:59:34.081086 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.081067 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" Apr 16 13:59:34.086903 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.085872 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-kx5cs" Apr 16 13:59:34.090909 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.090884 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c854f8_33e5_46ea_aa35_7026190215b7.slice/crio-1affdb2e27f42f19bc1c0c7ac321c28bb7e1d3f8735088f6c1cf03e08e3bf320 WatchSource:0}: Error finding container 1affdb2e27f42f19bc1c0c7ac321c28bb7e1d3f8735088f6c1cf03e08e3bf320: Status 404 returned error can't find the container with id 1affdb2e27f42f19bc1c0c7ac321c28bb7e1d3f8735088f6c1cf03e08e3bf320 Apr 16 13:59:34.090909 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.090901 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:34.094016 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.093985 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6892ba_2fcc_4246_87a3_cb11034c5167.slice/crio-9be25f6fef8c951d9d3dc29ed9e410f3db01e80bdbd0664034719d867e0c3967 WatchSource:0}: Error finding container 9be25f6fef8c951d9d3dc29ed9e410f3db01e80bdbd0664034719d867e0c3967: Status 404 returned error can't find the container with id 9be25f6fef8c951d9d3dc29ed9e410f3db01e80bdbd0664034719d867e0c3967 Apr 16 13:59:34.096310 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.096292 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" Apr 16 13:59:34.098499 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.098473 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdeff3e6_46e4_45e1_a8f2_7934598cbfbd.slice/crio-8ca8d2adc0d61560bdfc0119da8e8ce82ed1e45b69d28c700b06b21b625cf46d WatchSource:0}: Error finding container 8ca8d2adc0d61560bdfc0119da8e8ce82ed1e45b69d28c700b06b21b625cf46d: Status 404 returned error can't find the container with id 8ca8d2adc0d61560bdfc0119da8e8ce82ed1e45b69d28c700b06b21b625cf46d Apr 16 13:59:34.101767 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.101756 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d7g97" Apr 16 13:59:34.106478 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.106462 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x62b2" Apr 16 13:59:34.107494 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.106863 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f7ebb9_7806_4373_82d7_d8b4b85bc435.slice/crio-44202825272a290abc76e2058baff3d08a185d08caf4dc4ba9f14d88f5400c2b WatchSource:0}: Error finding container 44202825272a290abc76e2058baff3d08a185d08caf4dc4ba9f14d88f5400c2b: Status 404 returned error can't find the container with id 44202825272a290abc76e2058baff3d08a185d08caf4dc4ba9f14d88f5400c2b Apr 16 13:59:34.110539 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.110515 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7d9455_87de_4016_8826_39fe981aa729.slice/crio-9b8d1daf7aefa4df5c1218358f9cd2b6c57da92d071bb78a6736d483c33aad8b WatchSource:0}: Error finding container 9b8d1daf7aefa4df5c1218358f9cd2b6c57da92d071bb78a6736d483c33aad8b: Status 404 returned error can't find the container with id 9b8d1daf7aefa4df5c1218358f9cd2b6c57da92d071bb78a6736d483c33aad8b Apr 16 13:59:34.111563 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.111541 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:34.117710 ip-10-0-128-129 kubenswrapper[2572]: W0416 13:59:34.117679 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f7372f_60f8_484f_bdc7_063aea09785d.slice/crio-c72589fba68857ed044a4dd6bd66c8c0a15b3f3e9486a0d2ff2358c0f070383f WatchSource:0}: Error finding container c72589fba68857ed044a4dd6bd66c8c0a15b3f3e9486a0d2ff2358c0f070383f: Status 404 returned error can't find the container with id c72589fba68857ed044a4dd6bd66c8c0a15b3f3e9486a0d2ff2358c0f070383f Apr 16 13:59:34.383602 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.383574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:34.383787 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.383716 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:34.383787 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.383776 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:35.383761344 +0000 UTC m=+4.062268210 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:34.484316 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.484284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:34.484484 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.484462 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:34.484484 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.484482 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:34.484590 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.484495 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:34.484590 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:34.484557 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:35.484538077 +0000 UTC m=+4.163044959 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:34.801069 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.801026 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:32 +0000 UTC" deadline="2027-12-29 02:57:42.7328695 +0000 UTC" Apr 16 13:59:34.801069 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.801062 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14916h58m7.931811051s" Apr 16 13:59:34.906196 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.906162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x62b2" event={"ID":"42f7372f-60f8-484f-bdc7-063aea09785d","Type":"ContainerStarted","Data":"c72589fba68857ed044a4dd6bd66c8c0a15b3f3e9486a0d2ff2358c0f070383f"} Apr 16 13:59:34.907517 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.907490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d7g97" event={"ID":"8e7d9455-87de-4016-8826-39fe981aa729","Type":"ContainerStarted","Data":"9b8d1daf7aefa4df5c1218358f9cd2b6c57da92d071bb78a6736d483c33aad8b"} Apr 16 13:59:34.909545 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.909511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"8ca8d2adc0d61560bdfc0119da8e8ce82ed1e45b69d28c700b06b21b625cf46d"} Apr 16 13:59:34.912104 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.912075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-srjct" event={"ID":"05bbc014-12fb-4a97-900d-ab870f220e6f","Type":"ContainerStarted","Data":"5bc136871485c40d4b7a06395df6cdaeb1a05596e95b30adb180a6ce451cb602"} Apr 16 13:59:34.914810 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.914781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8nsr4" event={"ID":"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660","Type":"ContainerStarted","Data":"d8043ff23d7c0d60b24b736286eb6794ccc82479b2567620d825c564ae0df322"} Apr 16 13:59:34.917225 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.917202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" event={"ID":"ce2074e5-ffeb-4776-a271-517ad48e47e1","Type":"ContainerStarted","Data":"7c79bed90db301d81f1209462e36827285171267d65b22074c8aaeea32df6f27"} Apr 16 13:59:34.918660 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.918634 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" event={"ID":"c8f7ebb9-7806-4373-82d7-d8b4b85bc435","Type":"ContainerStarted","Data":"44202825272a290abc76e2058baff3d08a185d08caf4dc4ba9f14d88f5400c2b"} Apr 16 13:59:34.919728 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.919696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kx5cs" 
event={"ID":"8e6892ba-2fcc-4246-87a3-cb11034c5167","Type":"ContainerStarted","Data":"9be25f6fef8c951d9d3dc29ed9e410f3db01e80bdbd0664034719d867e0c3967"} Apr 16 13:59:34.921352 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:34.921309 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerStarted","Data":"1affdb2e27f42f19bc1c0c7ac321c28bb7e1d3f8735088f6c1cf03e08e3bf320"} Apr 16 13:59:35.392136 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:35.392102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:35.392321 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.392261 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:35.392415 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.392349 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:37.392309573 +0000 UTC m=+6.070816456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:35.493517 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:35.492915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:35.493517 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.493082 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:35.493517 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.493100 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:35.493517 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.493113 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:35.493517 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.493178 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:37.493158539 +0000 UTC m=+6.171665406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:35.898690 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:35.898637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:35.899095 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.898828 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:35.899271 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:35.899254 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:35.899385 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:35.899364 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:36.948640 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:36.946585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" event={"ID":"c13d13ed86434247999d294ebb97dea1","Type":"ContainerStarted","Data":"440f0ce22d48983a75ee5dd1949ce775344acdf3a38664a4aea16d97d80efedd"} Apr 16 13:59:37.412586 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:37.412505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:37.412758 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.412650 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:37.412758 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.412715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.412694045 +0000 UTC m=+10.091200917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:37.513673 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:37.513639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:37.513816 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.513806 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:37.513854 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.513824 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:37.513854 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.513836 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:37.513909 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.513892 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.513872874 +0000 UTC m=+10.192379741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:37.900985 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:37.900915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:37.901143 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.901040 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:37.901471 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:37.901448 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:37.901612 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:37.901563 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:39.899176 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:39.899137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:39.899606 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:39.899269 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:39.899659 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:39.899137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:39.899758 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:39.899737 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:39.954609 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:39.954573 2572 generic.go:358] "Generic (PLEG): container finished" podID="c13d13ed86434247999d294ebb97dea1" containerID="440f0ce22d48983a75ee5dd1949ce775344acdf3a38664a4aea16d97d80efedd" exitCode=0 Apr 16 13:59:39.954753 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:39.954619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" event={"ID":"c13d13ed86434247999d294ebb97dea1","Type":"ContainerDied","Data":"440f0ce22d48983a75ee5dd1949ce775344acdf3a38664a4aea16d97d80efedd"} Apr 16 13:59:41.444520 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:41.444484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:41.444930 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.444619 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:41.444930 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.444686 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:49.444665882 +0000 UTC m=+18.123172752 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:41.545178 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:41.545147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:41.545367 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.545318 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:41.545367 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.545356 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:41.545487 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.545369 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:41.545487 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.545422 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:49.545403423 +0000 UTC m=+18.223910303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:41.899557 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:41.899524 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:41.899721 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.899646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:41.900011 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:41.899994 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:41.900106 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:41.900086 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:43.899167 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:43.899128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:43.899597 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:43.899274 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:43.899597 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:43.899583 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:43.899687 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:43.899663 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:45.898266 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:45.898233 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:45.898690 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:45.898240 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:45.898690 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:45.898389 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:45.898690 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:45.898454 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:47.898878 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:47.898852 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:47.899287 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:47.898946 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:47.899287 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:47.899020 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:47.899287 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:47.899103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:49.506569 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:49.506533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:49.506991 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.506679 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:49.506991 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.506749 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:05.506730962 +0000 UTC m=+34.185237842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:49.607405 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:49.607371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:49.607559 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.607519 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:49.607559 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.607537 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:49.607559 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.607546 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:49.607702 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.607597 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:05.607578979 +0000 UTC m=+34.286085859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:49.898981 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:49.898905 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:49.899129 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.899017 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:49.899129 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:49.899087 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:49.899262 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:49.899198 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:51.899527 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:51.899488 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:51.899959 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:51.899593 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:51.900032 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:51.899995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:51.900117 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:51.900091 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:52.975740 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.975536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x62b2" event={"ID":"42f7372f-60f8-484f-bdc7-063aea09785d","Type":"ContainerStarted","Data":"ed5fdeee8c7ff12cab24d102d4a3930d984faeaa18a60971ce6ad924a03a1ad1"} Apr 16 13:59:52.976950 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.976911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d7g97" event={"ID":"8e7d9455-87de-4016-8826-39fe981aa729","Type":"ContainerStarted","Data":"5225be1afd86849001b1cd98a7e8a46d7823e799016b005ade27b4ff8f87db49"} Apr 16 13:59:52.978562 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.978536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"5b38cd4ec0919124611df5ae4a9f3129022bc24505cfb0b1228c320e88199d16"} Apr 16 13:59:52.979853 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.979743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-srjct" event={"ID":"05bbc014-12fb-4a97-900d-ab870f220e6f","Type":"ContainerStarted","Data":"89bc3370b4d1d52c63626f7eda5d7997596d6de57baa68331e16eeaf829e7b5f"} Apr 16 13:59:52.981599 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.981573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8nsr4" event={"ID":"6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660","Type":"ContainerStarted","Data":"7228f1d234bc1d183caee8834f17043a011c36f207736b4883659f98d8fb9cbb"} Apr 16 13:59:52.983147 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.983120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" event={"ID":"ce2074e5-ffeb-4776-a271-517ad48e47e1","Type":"ContainerStarted","Data":"46bc1ed6ae99ac67ea530f27dfe7914f78cc84baa35b2e36ef01ad6b53b684c0"} Apr 16 13:59:52.985322 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.985300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal" event={"ID":"ed2e5e70f0ff42ca6d01355865582d97","Type":"ContainerStarted","Data":"08c4e03fb894f6d22812d6b6a31816b7a1c9b67fedd201df9b14ce71449b0dc4"} Apr 16 13:59:52.987152 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.987128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" event={"ID":"c13d13ed86434247999d294ebb97dea1","Type":"ContainerStarted","Data":"b5c3a12f557dfdf61a042a1c9132fbdfa2bc490a561cd5c304edae2da00ed36d"} Apr 16 13:59:52.990476 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.990446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" event={"ID":"c8f7ebb9-7806-4373-82d7-d8b4b85bc435","Type":"ContainerStarted","Data":"9b884a19b4099439201a299e9ceb404962c8eabf9eff38fc352af48132d14951"} Apr 16 13:59:52.991807 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.991786 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerStarted","Data":"aa28fa97e770b26464232cb7382031e453f07e6c948953aea0f5bf3fee84a374"} Apr 16 
Apr 16 13:59:52.994051 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:52.994010 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x62b2" podStartSLOduration=2.618417284 podStartE2EDuration="20.993998708s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.120893815 +0000 UTC m=+2.799400697" lastFinishedPulling="2026-04-16 13:59:52.496475241 +0000 UTC m=+21.174982121" observedRunningTime="2026-04-16 13:59:52.993548351 +0000 UTC m=+21.672055240" watchObservedRunningTime="2026-04-16 13:59:52.993998708 +0000 UTC m=+21.672505597"
Apr 16 13:59:53.008226 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.008190 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-129.ec2.internal" podStartSLOduration=21.008178574 podStartE2EDuration="21.008178574s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:53.008079369 +0000 UTC m=+21.686586259" watchObservedRunningTime="2026-04-16 13:59:53.008178574 +0000 UTC m=+21.686685462"
Apr 16 13:59:53.021432 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.021265 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-129.ec2.internal" podStartSLOduration=21.021253617 podStartE2EDuration="21.021253617s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:53.020921622 +0000 UTC m=+21.699428523" watchObservedRunningTime="2026-04-16 13:59:53.021253617 +0000 UTC m=+21.699760502"
Apr 16 13:59:53.038291 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.038161 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vmp2w" podStartSLOduration=3.637145525 podStartE2EDuration="22.038149096s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.074783337 +0000 UTC m=+2.753290202" lastFinishedPulling="2026-04-16 13:59:52.475786905 +0000 UTC m=+21.154293773" observedRunningTime="2026-04-16 13:59:53.037772525 +0000 UTC m=+21.716279418" watchObservedRunningTime="2026-04-16 13:59:53.038149096 +0000 UTC m=+21.716656010"
Apr 16 13:59:53.052394 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.052123 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8nsr4" podStartSLOduration=11.703079089 podStartE2EDuration="22.05210739s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.082419444 +0000 UTC m=+2.760926323" lastFinishedPulling="2026-04-16 13:59:44.431447752 +0000 UTC m=+13.109954624" observedRunningTime="2026-04-16 13:59:53.052005231 +0000 UTC m=+21.730512120" watchObservedRunningTime="2026-04-16 13:59:53.05210739 +0000 UTC m=+21.730614279"
Apr 16 13:59:53.088489 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.088452 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-srjct" podStartSLOduration=3.7321586939999998 podStartE2EDuration="22.088432635s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.084759463 +0000 UTC m=+2.763266340" lastFinishedPulling="2026-04-16 13:59:52.441033402 +0000 UTC m=+21.119540281" observedRunningTime="2026-04-16 13:59:53.087955283 +0000 UTC m=+21.766462165" watchObservedRunningTime="2026-04-16 13:59:53.088432635 +0000 UTC m=+21.766939524"
Apr 16 13:59:53.101170 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.101119 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d7g97" podStartSLOduration=2.773087667 podStartE2EDuration="21.101100752s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.112984151 +0000 UTC m=+2.791491020" lastFinishedPulling="2026-04-16 13:59:52.440997225 +0000 UTC m=+21.119504105" observedRunningTime="2026-04-16 13:59:53.100453675 +0000 UTC m=+21.778960562" watchObservedRunningTime="2026-04-16 13:59:53.101100752 +0000 UTC m=+21.779607641"
Apr 16 13:59:53.317599 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.317576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-srjct"
Apr 16 13:59:53.318245 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.318228 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-srjct"
Apr 16 13:59:53.899121 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.899090 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 13:59:53.899318 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:53.899202 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649"
Apr 16 13:59:53.899318 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.899090 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl"
Apr 16 13:59:53.899462 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:53.899394 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc"
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:53.996157 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.996121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"c03bc6976e98b82a0734080cd0671ef3aa774c511ad44e13c43d1928b744e0e9"} Apr 16 13:59:53.996157 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.996158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"c047669ed66672eb9f3663f214dc31ae7714011e2e645ab586d6be2675396abf"} Apr 16 13:59:53.996607 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.996170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"3b219f532ba3860dc3dc9a5d70cc0866bc022317088029b1900863271300523e"} Apr 16 13:59:53.996607 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.996179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"859f2034a83b15ca2ca8915c860647bec35bfce09cedc232de3baf6abc1ada25"} Apr 16 13:59:53.996607 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.996187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"6a8aa21c07ea47a0fb4f95a2dc85471a9a26de007b0efae908d9570c3e23e9f5"} Apr 16 13:59:53.997375 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.997326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kx5cs" event={"ID":"8e6892ba-2fcc-4246-87a3-cb11034c5167","Type":"ContainerStarted","Data":"c3e1748a91d65ec92b88a5919ca324c280323ce4bf6caffe38f671b7ba454320"} Apr 16 13:59:53.998707 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.998682 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="aa28fa97e770b26464232cb7382031e453f07e6c948953aea0f5bf3fee84a374" exitCode=0 Apr 16 13:59:53.998846 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.998825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"aa28fa97e770b26464232cb7382031e453f07e6c948953aea0f5bf3fee84a374"} Apr 16 13:59:53.999848 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.999697 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:53.999848 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:53.999741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-srjct" Apr 16 13:59:54.028621 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:54.028585 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kx5cs" podStartSLOduration=4.683627667 podStartE2EDuration="23.028573451s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.09562829 +0000 UTC 
m=+2.774135157" lastFinishedPulling="2026-04-16 13:59:52.440574058 +0000 UTC m=+21.119080941" observedRunningTime="2026-04-16 13:59:54.014231945 +0000 UTC m=+22.692738824" watchObservedRunningTime="2026-04-16 13:59:54.028573451 +0000 UTC m=+22.707080338" Apr 16 13:59:54.398110 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:54.398076 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:54.842446 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:54.842345 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:54.398106986Z","UUID":"5a8292d9-6bac-4204-bf83-e8a53842d37c","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:54.844905 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:54.844885 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:54.844905 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:54.844910 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:55.002387 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:55.002354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" event={"ID":"c8f7ebb9-7806-4373-82d7-d8b4b85bc435","Type":"ContainerStarted","Data":"8a6affb1478dfa3ae7e15232cf29b6b08f2c1ff23a3b6d4b74d3284ef1cd8a30"} Apr 16 13:59:55.899287 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:55.899049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:55.899463 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:55.899055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:55.899463 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:55.899412 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:55.899545 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:55.899486 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:56.007109 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:56.007060 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"9061f11d6468965d5f44e1c74883fa41238f4bb360f6abcffbe61677917c00bb"} Apr 16 13:59:57.010954 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:57.010915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" event={"ID":"c8f7ebb9-7806-4373-82d7-d8b4b85bc435","Type":"ContainerStarted","Data":"d3c8886b4f0bbd9de469adc70984cf714fa39e5605283e00383d0d31f17786a4"} Apr 16 13:59:57.032553 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:57.030735 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pj26k" podStartSLOduration=3.064206077 podStartE2EDuration="25.03071761s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.108768804 +0000 UTC m=+2.787275671" lastFinishedPulling="2026-04-16 13:59:56.075280338 +0000 UTC m=+24.753787204" observedRunningTime="2026-04-16 13:59:57.02941969 +0000 UTC m=+25.707926581" watchObservedRunningTime="2026-04-16 13:59:57.03071761 +0000 UTC m=+25.709224500" Apr 16 13:59:57.898661 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:57.898590 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:57.898661 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:57.898625 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:57.898885 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:57.898699 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 13:59:57.898885 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:57.898844 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:59.016553 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.016322 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="45d0538a583a2e39f610064ebcb78752f76435640fa126364d92cb9c32b11b72" exitCode=0 Apr 16 13:59:59.017109 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.016367 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"45d0538a583a2e39f610064ebcb78752f76435640fa126364d92cb9c32b11b72"} Apr 16 13:59:59.020893 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.020861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" event={"ID":"bdeff3e6-46e4-45e1-a8f2-7934598cbfbd","Type":"ContainerStarted","Data":"f3f446c1b22301cd438962ad4d92148e3a943404df70e9d09737a3d659f380d7"} Apr 16 13:59:59.021248 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.021223 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:59.021248 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.021252 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:59.021456 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.021267 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:59.037561 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.037534 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:59.037736 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.037718 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 13:59:59.063260 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.063212 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" podStartSLOduration=8.309510826 podStartE2EDuration="27.063200545s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.100329286 +0000 UTC m=+2.778836151" lastFinishedPulling="2026-04-16 13:59:52.854019001 +0000 UTC m=+21.532525870" observedRunningTime="2026-04-16 13:59:59.062254468 +0000 UTC m=+27.740761356" watchObservedRunningTime="2026-04-16 13:59:59.063200545 +0000 UTC m=+27.741707433" Apr 16 13:59:59.899002 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.898974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 13:59:59.899158 ip-10-0-128-129 kubenswrapper[2572]: I0416 13:59:59.899007 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 13:59:59.899158 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:59.899094 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 13:59:59.899234 ip-10-0-128-129 kubenswrapper[2572]: E0416 13:59:59.899210 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:01.026555 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:01.026519 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="1ba2818e4c73616a0cb0af11a14de0320f69f62bd8449b55ec34462a82596ba2" exitCode=0 Apr 16 14:00:01.026984 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:01.026601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"1ba2818e4c73616a0cb0af11a14de0320f69f62bd8449b55ec34462a82596ba2"} Apr 16 14:00:01.898891 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:01.898862 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:01.899047 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:01.898937 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:01.899047 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:01.899030 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:01.899172 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:01.899150 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:03.032313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:03.032281 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="8b08f96a3da7724f75203f76d66e521b0826b170cdcb1bb51cb6d1d206df1601" exitCode=0 Apr 16 14:00:03.032703 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:03.032371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"8b08f96a3da7724f75203f76d66e521b0826b170cdcb1bb51cb6d1d206df1601"} Apr 16 14:00:03.898158 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:03.898123 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:03.898354 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:03.898135 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:03.898354 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:03.898235 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:03.898354 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:03.898307 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:05.522738 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:05.522699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:05.523398 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.522833 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:05.523398 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.522901 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs podName:59442fa9-d5a4-452c-bf14-f93d58af99dc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:37.522881121 +0000 UTC m=+66.201387990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs") pod "network-metrics-daemon-7bhkl" (UID: "59442fa9-d5a4-452c-bf14-f93d58af99dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:05.623989 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:05.623960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:05.624126 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.624072 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:00:05.624126 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.624085 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:00:05.624126 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.624094 2572 projected.go:194] Error preparing data for projected volume kube-api-access-4zjbf for pod openshift-network-diagnostics/network-check-target-bh7db: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:00:05.624236 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.624140 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf podName:0020e0fe-4923-4ecf-86ba-90de98fb3649 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:37.624126232 +0000 UTC m=+66.302633098 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4zjbf" (UniqueName: "kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf") pod "network-check-target-bh7db" (UID: "0020e0fe-4923-4ecf-86ba-90de98fb3649") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:00:05.898564 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:05.898481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:05.898717 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:05.898573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:05.898717 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.898688 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
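[Editor's note: both MountVolume failures above schedule their next retry 32 seconds out. That is consistent with per-operation exponential backoff: the delay doubles on every consecutive failure of the same volume operation, so a 32s durationBeforeRetry means the operation has already failed several times. A generic sketch of that policy; the 500ms seed and the ~2m cap are assumptions, not verified kubelet constants.]

    // backoff.go - doubling per-failure retry delay with a cap.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay := 500 * time.Millisecond           // assumed initial delay
    	maxDelay := 2*time.Minute + 2*time.Second // assumed cap
    	for failure := 1; failure <= 8; failure++ {
    		fmt.Printf("failure %d: durationBeforeRetry %v\n", failure, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    	// the 7th consecutive failure prints 32s, matching the log entries above
    }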
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:05.898826 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:05.898746 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:07.898531 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:07.898499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:07.898951 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:07.898510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:07.898951 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:07.898619 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:07.898951 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:07.898681 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:09.898886 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:09.898856 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:09.899275 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:09.898947 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:09.899275 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:09.899029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:09.899275 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:09.899117 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:11.049174 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:11.049148 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="d2894e4f05d281763e239be8608d40bb0c6638836be34c6ee168b68821f53b5a" exitCode=0 Apr 16 14:00:11.049495 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:11.049199 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"d2894e4f05d281763e239be8608d40bb0c6638836be34c6ee168b68821f53b5a"} Apr 16 14:00:11.914488 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:11.914455 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:11.914613 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:11.914576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:11.914613 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:11.914587 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:11.914725 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:11.914659 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:12.053360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:12.053319 2572 generic.go:358] "Generic (PLEG): container finished" podID="98c854f8-33e5-46ea-aa35-7026190215b7" containerID="1c1d47a387a18072ab3cbadf17a94f6819abb32d6864296ae8e3af5e325f4247" exitCode=0 Apr 16 14:00:12.053660 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:12.053365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerDied","Data":"1c1d47a387a18072ab3cbadf17a94f6819abb32d6864296ae8e3af5e325f4247"} Apr 16 14:00:13.057835 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.057803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" event={"ID":"98c854f8-33e5-46ea-aa35-7026190215b7","Type":"ContainerStarted","Data":"a68863d89eca4d9141dbdbf06dcf90eea6140706822cfebb87a39f5ee07f470e"} Apr 16 14:00:13.082234 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.082193 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ghk8n" podStartSLOduration=5.876442424 podStartE2EDuration="42.082179956s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 13:59:34.092509383 +0000 UTC m=+2.771016254" lastFinishedPulling="2026-04-16 14:00:10.298246919 +0000 UTC m=+38.976753786" observedRunningTime="2026-04-16 14:00:13.081120272 +0000 UTC m=+41.759627209" watchObservedRunningTime="2026-04-16 14:00:13.082179956 +0000 UTC m=+41.760686843" Apr 16 14:00:13.179611 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.179581 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7bhkl"] Apr 16 14:00:13.179747 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.179731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:13.179875 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:13.179846 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:13.182324 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.182302 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bh7db"] Apr 16 14:00:13.182442 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:13.182424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:13.182516 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:13.182499 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:14.899051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:14.899017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:14.899550 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:14.899017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:14.899550 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:14.899110 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bh7db" podUID="0020e0fe-4923-4ecf-86ba-90de98fb3649" Apr 16 14:00:14.899550 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:14.899196 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bhkl" podUID="59442fa9-d5a4-452c-bf14-f93d58af99dc" Apr 16 14:00:16.674717 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.674667 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-129.ec2.internal" event="NodeReady" Apr 16 14:00:16.675020 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.674772 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:00:16.730954 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.730930 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2nwg4"] Apr 16 14:00:16.765696 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.765675 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hq77c"] Apr 16 14:00:16.765825 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.765807 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:16.769072 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.769050 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:00:16.769763 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.769719 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:00:16.769881 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.769772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gjmb2\"" Apr 16 14:00:16.769881 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.769798 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:00:16.795725 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.795696 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2nwg4"] Apr 16 14:00:16.795725 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.795721 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hq77c"] Apr 16 14:00:16.795879 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.795735 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n4t6t"] Apr 16 14:00:16.795879 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.795836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.798286 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.798268 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:16.798550 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.798536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:16.798597 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.798536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:16.798810 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.798795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h5f2n\"" Apr 16 14:00:16.799140 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.799125 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:16.822978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.822952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n4t6t"] Apr 16 14:00:16.823074 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.823066 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:16.825654 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.825638 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:00:16.825654 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.825649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-pfkw7\"" Apr 16 14:00:16.825782 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.825680 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:00:16.898235 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.898216 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:00:16.898317 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.898218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl" Apr 16 14:00:16.900753 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.900732 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcmqf\"" Apr 16 14:00:16.900858 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.900790 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:16.900858 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.900810 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:16.900858 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.900838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\"" Apr 16 14:00:16.900858 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.900838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:16.904071 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6477641f-7f99-4033-bbb6-4840e371fbd2-crio-socket\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.904071 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6477641f-7f99-4033-bbb6-4840e371fbd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.904188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " 
pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.904188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6477641f-7f99-4033-bbb6-4840e371fbd2-data-volume\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.904188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf86f\" (UniqueName: \"kubernetes.io/projected/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-api-access-zf86f\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:16.904364 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42fcb6b4-c71f-4846-9e64-95662201e229-cert\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:16.904364 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:16.904245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pgp\" (UniqueName: \"kubernetes.io/projected/42fcb6b4-c71f-4846-9e64-95662201e229-kube-api-access-b4pgp\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.005322 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6477641f-7f99-4033-bbb6-4840e371fbd2-crio-socket\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005419 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6477641f-7f99-4033-bbb6-4840e371fbd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005457 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-config-volume\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.005491 ip-10-0-128-129 
kubenswrapper[2572]: I0416 14:00:17.005484 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx5c\" (UniqueName: \"kubernetes.io/projected/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-kube-api-access-vdx5c\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.005586 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6477641f-7f99-4033-bbb6-4840e371fbd2-crio-socket\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005586 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6477641f-7f99-4033-bbb6-4840e371fbd2-data-volume\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005586 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf86f\" (UniqueName: \"kubernetes.io/projected/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-api-access-zf86f\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.005698 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42fcb6b4-c71f-4846-9e64-95662201e229-cert\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.005698 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-tmp-dir\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.005825 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pgp\" (UniqueName: \"kubernetes.io/projected/42fcb6b4-c71f-4846-9e64-95662201e229-kube-api-access-b4pgp\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.005825 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6477641f-7f99-4033-bbb6-4840e371fbd2-data-volume\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.006114 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-metrics-tls\") pod 
\"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.006114 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.005995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.009347 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.009312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6477641f-7f99-4033-bbb6-4840e371fbd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.009520 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.009504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42fcb6b4-c71f-4846-9e64-95662201e229-cert\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.014316 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.014293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pgp\" (UniqueName: \"kubernetes.io/projected/42fcb6b4-c71f-4846-9e64-95662201e229-kube-api-access-b4pgp\") pod \"ingress-canary-2nwg4\" (UID: \"42fcb6b4-c71f-4846-9e64-95662201e229\") " pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.014316 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.014308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf86f\" (UniqueName: \"kubernetes.io/projected/6477641f-7f99-4033-bbb6-4840e371fbd2-kube-api-access-zf86f\") pod \"insights-runtime-extractor-hq77c\" (UID: \"6477641f-7f99-4033-bbb6-4840e371fbd2\") " pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.075449 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.075434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2nwg4" Apr 16 14:00:17.104262 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.104209 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hq77c" Apr 16 14:00:17.106897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.106878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-config-volume\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.106982 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.106906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx5c\" (UniqueName: \"kubernetes.io/projected/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-kube-api-access-vdx5c\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.106982 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.106927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-tmp-dir\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.106982 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.106969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-metrics-tls\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.107365 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.107323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-tmp-dir\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.107478 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.107463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-config-volume\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.109625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.109606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-metrics-tls\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.114610 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.114556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx5c\" (UniqueName: \"kubernetes.io/projected/360ea35e-bf48-4c5d-aeb1-dbe4c67646c3-kube-api-access-vdx5c\") pod \"dns-default-n4t6t\" (UID: \"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3\") " pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.130666 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.130644 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:17.258451 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.258360 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2nwg4"] Apr 16 14:00:17.261749 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.261723 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hq77c"] Apr 16 14:00:17.262148 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:17.262117 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fcb6b4_c71f_4846_9e64_95662201e229.slice/crio-d61515bac238cfd778720bf639ac2d5d0eb911c87fdca31924ac55c403bd0c27 WatchSource:0}: Error finding container d61515bac238cfd778720bf639ac2d5d0eb911c87fdca31924ac55c403bd0c27: Status 404 returned error can't find the container with id d61515bac238cfd778720bf639ac2d5d0eb911c87fdca31924ac55c403bd0c27 Apr 16 14:00:17.264667 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:17.264646 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6477641f_7f99_4033_bbb6_4840e371fbd2.slice/crio-061203fbb7d7f608962744991f9f34952428e31aaaecc2e1ef9cafbc3cd6ddc1 WatchSource:0}: Error finding container 061203fbb7d7f608962744991f9f34952428e31aaaecc2e1ef9cafbc3cd6ddc1: Status 404 returned error can't find the container with id 061203fbb7d7f608962744991f9f34952428e31aaaecc2e1ef9cafbc3cd6ddc1 Apr 16 14:00:17.266816 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:17.266776 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n4t6t"] Apr 16 14:00:17.269266 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:17.269244 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod360ea35e_bf48_4c5d_aeb1_dbe4c67646c3.slice/crio-83001576ad0239c23ca8b7b430c483b3be7642869b8dd534adab1d0548cf3829 WatchSource:0}: Error finding container 83001576ad0239c23ca8b7b430c483b3be7642869b8dd534adab1d0548cf3829: Status 404 returned error can't find the container with id 83001576ad0239c23ca8b7b430c483b3be7642869b8dd534adab1d0548cf3829 Apr 16 14:00:18.065789 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.065755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n4t6t" event={"ID":"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3","Type":"ContainerStarted","Data":"83001576ad0239c23ca8b7b430c483b3be7642869b8dd534adab1d0548cf3829"} Apr 16 14:00:18.066939 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.066917 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hq77c" event={"ID":"6477641f-7f99-4033-bbb6-4840e371fbd2","Type":"ContainerStarted","Data":"85312313c58413319f25d97810db75ae6d3a507f673c5335bbd2b6d2aad01520"} Apr 16 14:00:18.067027 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.066945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hq77c" event={"ID":"6477641f-7f99-4033-bbb6-4840e371fbd2","Type":"ContainerStarted","Data":"061203fbb7d7f608962744991f9f34952428e31aaaecc2e1ef9cafbc3cd6ddc1"} Apr 16 14:00:18.067807 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.067788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2nwg4" 
event={"ID":"42fcb6b4-c71f-4846-9e64-95662201e229","Type":"ContainerStarted","Data":"d61515bac238cfd778720bf639ac2d5d0eb911c87fdca31924ac55c403bd0c27"} Apr 16 14:00:18.150231 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.150206 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"] Apr 16 14:00:18.170823 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.170800 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"] Apr 16 14:00:18.170939 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.170924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.173528 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.173511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:00:18.174626 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:00:18.174726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:00:18.174726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174665 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:00:18.174726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174686 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sb62k\"" Apr 16 14:00:18.174726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:00:18.174726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:00:18.174993 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.174972 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:00:18.178473 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.178454 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:00:18.315624 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315662 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmmp\" (UniqueName: \"kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.315924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.315782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.416796 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.416910 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.416910 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.416910 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.417061 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmmp\" (UniqueName: \"kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.417061 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.416963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.417061 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.417056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.418877 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.418851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.418977 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.418873 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.419425 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.419404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.421143 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.421122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.421352 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.421315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.422754 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.422730 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.426205 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.426177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmmp\" (UniqueName: \"kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp\") pod \"console-65bf79df8-6cjvd\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") " pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.480900 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.480612 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:18.620429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:18.620388 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"] Apr 16 14:00:18.624000 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:18.623972 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2408a9_8a08_4dce_83b6_d4e4cfefbb01.slice/crio-4bbd2642fdf2929501a6788938da8ab04f42660a3abd1e347f55fb01d189ad3d WatchSource:0}: Error finding container 4bbd2642fdf2929501a6788938da8ab04f42660a3abd1e347f55fb01d189ad3d: Status 404 returned error can't find the container with id 4bbd2642fdf2929501a6788938da8ab04f42660a3abd1e347f55fb01d189ad3d Apr 16 14:00:19.071625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:19.071592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65bf79df8-6cjvd" event={"ID":"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01","Type":"ContainerStarted","Data":"4bbd2642fdf2929501a6788938da8ab04f42660a3abd1e347f55fb01d189ad3d"} Apr 16 14:00:21.078622 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:21.078423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hq77c" event={"ID":"6477641f-7f99-4033-bbb6-4840e371fbd2","Type":"ContainerStarted","Data":"9599c96ab372b0fb6d302b185cf6fe9e52f1b387359b42d0d1639af67f7fccc2"} Apr 16 14:00:21.080816 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:21.080123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2nwg4" event={"ID":"42fcb6b4-c71f-4846-9e64-95662201e229","Type":"ContainerStarted","Data":"f2669196e1b4df472b55be8aef392c0352141fe1cdfb3490f95ffdca357b7038"} Apr 16 14:00:21.082978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:21.082942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n4t6t" event={"ID":"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3","Type":"ContainerStarted","Data":"ab8f2daa0226282ebe4a16f5f590a74e36b7dce66edac87deea0225cde837b48"} Apr 16 14:00:21.918101 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:21.918051 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2nwg4" podStartSLOduration=2.563005812 podStartE2EDuration="5.918032788s" podCreationTimestamp="2026-04-16 14:00:16 +0000 UTC" firstStartedPulling="2026-04-16 14:00:17.263924339 +0000 UTC m=+45.942431206" lastFinishedPulling="2026-04-16 14:00:20.618951317 +0000 UTC m=+49.297458182" observedRunningTime="2026-04-16 
14:00:21.099566392 +0000 UTC m=+49.778073279" watchObservedRunningTime="2026-04-16 14:00:21.918032788 +0000 UTC m=+50.596539677" Apr 16 14:00:22.086025 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:22.085991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n4t6t" event={"ID":"360ea35e-bf48-4c5d-aeb1-dbe4c67646c3","Type":"ContainerStarted","Data":"c277bd8efaa15f5da2828423c1c86fe7c0e737910421d928639dfc5f8c5645b4"} Apr 16 14:00:22.104054 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:22.104015 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n4t6t" podStartSLOduration=2.756235819 podStartE2EDuration="6.104003128s" podCreationTimestamp="2026-04-16 14:00:16 +0000 UTC" firstStartedPulling="2026-04-16 14:00:17.271182607 +0000 UTC m=+45.949689473" lastFinishedPulling="2026-04-16 14:00:20.618949916 +0000 UTC m=+49.297456782" observedRunningTime="2026-04-16 14:00:22.103394544 +0000 UTC m=+50.781901431" watchObservedRunningTime="2026-04-16 14:00:22.104003128 +0000 UTC m=+50.782510023" Apr 16 14:00:23.089059 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:23.088784 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:24.092403 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:24.092368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65bf79df8-6cjvd" event={"ID":"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01","Type":"ContainerStarted","Data":"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081"} Apr 16 14:00:24.107779 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:24.107738 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65bf79df8-6cjvd" podStartSLOduration=1.057069757 podStartE2EDuration="6.107724387s" podCreationTimestamp="2026-04-16 14:00:18 +0000 UTC" firstStartedPulling="2026-04-16 14:00:18.626278797 +0000 UTC m=+47.304785669" lastFinishedPulling="2026-04-16 14:00:23.676933417 +0000 UTC m=+52.355440299" observedRunningTime="2026-04-16 14:00:24.107443511 +0000 UTC m=+52.785950400" watchObservedRunningTime="2026-04-16 14:00:24.107724387 +0000 UTC m=+52.786231274" Apr 16 14:00:25.096971 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:25.096925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hq77c" event={"ID":"6477641f-7f99-4033-bbb6-4840e371fbd2","Type":"ContainerStarted","Data":"b0bfdbbbba7cacaae8b028e6f29668820521e51988cdd66504658d5de26492ab"} Apr 16 14:00:25.114559 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:25.114519 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hq77c" podStartSLOduration=1.9264233499999999 podStartE2EDuration="9.114505728s" podCreationTimestamp="2026-04-16 14:00:16 +0000 UTC" firstStartedPulling="2026-04-16 14:00:17.417140743 +0000 UTC m=+46.095647610" lastFinishedPulling="2026-04-16 14:00:24.605223106 +0000 UTC m=+53.283729988" observedRunningTime="2026-04-16 14:00:25.113454415 +0000 UTC m=+53.791961297" watchObservedRunningTime="2026-04-16 14:00:25.114505728 +0000 UTC m=+53.793012648" Apr 16 14:00:26.445586 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.445557 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh"] Apr 16 14:00:26.448611 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.448596 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.452059 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.452036 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:00:26.452271 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.452257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:00:26.453425 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.453406 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-z5r89\"" Apr 16 14:00:26.453539 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.453438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:26.453539 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.453455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:00:26.453652 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.453539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:26.459940 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.459916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh"] Apr 16 14:00:26.463499 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.463474 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9k4bn"] Apr 16 14:00:26.466370 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.466326 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-nht7j"] Apr 16 14:00:26.466483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.466465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.469249 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.469229 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:00:26.469346 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.469258 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nmjsm\"" Apr 16 14:00:26.469346 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.469276 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:00:26.469463 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.469398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:00:26.469519 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.469508 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.471887 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.471872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:00:26.472220 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.472200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:00:26.472323 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.472225 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:00:26.472323 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.472208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4mnqx\"" Apr 16 14:00:26.476532 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.476516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-nht7j"] Apr 16 14:00:26.478772 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.478749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-textfile\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.478889 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.478792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.478889 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.478853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.479000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.478888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.479000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.478918 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stg7k\" (UniqueName: \"kubernetes.io/projected/c82633dc-21cb-4f43-9155-2073ed72f663-kube-api-access-stg7k\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479000 ip-10-0-128-129 kubenswrapper[2572]: I0416 
14:00:26.478949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-root\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzpm\" (UniqueName: \"kubernetes.io/projected/bde45541-f349-4350-8270-52b3eaad1325-kube-api-access-tzzpm\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.479141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2kzm\" (UniqueName: \"kubernetes.io/projected/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-kube-api-access-g2kzm\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: 
\"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-wtmp\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-metrics-client-ca\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-sys\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bde45541-f349-4350-8270-52b3eaad1325-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.479447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.479421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580152 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-wtmp\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-metrics-client-ca\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-sys\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bde45541-f349-4350-8270-52b3eaad1325-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580291 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-textfile\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-sys\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-metrics-client-ca\") pod 
\"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-wtmp\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stg7k\" (UniqueName: \"kubernetes.io/projected/c82633dc-21cb-4f43-9155-2073ed72f663-kube-api-access-stg7k\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-root\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.580625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzpm\" (UniqueName: \"kubernetes.io/projected/bde45541-f349-4350-8270-52b3eaad1325-kube-api-access-tzzpm\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" 
Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2kzm\" (UniqueName: \"kubernetes.io/projected/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-kube-api-access-g2kzm\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bde45541-f349-4350-8270-52b3eaad1325-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.580914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-metrics-client-ca\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.581046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.581120 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581087 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581153 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls podName:bde45541-f349-4350-8270-52b3eaad1325 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.081135001 +0000 UTC m=+55.759641886 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-nht7j" (UID: "bde45541-f349-4350-8270-52b3eaad1325") : secret "kube-state-metrics-tls" not found Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581232 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.581250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-textfile\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581295 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls podName:c82633dc-21cb-4f43-9155-2073ed72f663 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.081276794 +0000 UTC m=+55.759783665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls") pod "node-exporter-9k4bn" (UID: "c82633dc-21cb-4f43-9155-2073ed72f663") : secret "node-exporter-tls" not found Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.581394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581429 2572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:00:26.581471 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.581455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c82633dc-21cb-4f43-9155-2073ed72f663-root\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.581760 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:26.581479 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls podName:bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.081466489 +0000 UTC m=+55.759973369 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-wjkdh" (UID: "bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd") : secret "openshift-state-metrics-tls" not found Apr 16 14:00:26.581830 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.581810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.582395 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.582374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.584574 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.584551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.584574 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.584569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.584720 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.584553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:26.589675 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.589654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stg7k\" (UniqueName: \"kubernetes.io/projected/c82633dc-21cb-4f43-9155-2073ed72f663-kube-api-access-stg7k\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:26.589924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.589901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzpm\" (UniqueName: \"kubernetes.io/projected/bde45541-f349-4350-8270-52b3eaad1325-kube-api-access-tzzpm\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:26.590040 
ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:26.590025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2kzm\" (UniqueName: \"kubernetes.io/projected/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-kube-api-access-g2kzm\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:27.084287 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.084261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:27.084427 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.084325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:27.084427 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.084374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:27.084427 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:27.084411 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:00:27.084532 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:27.084474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls podName:c82633dc-21cb-4f43-9155-2073ed72f663 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:28.084455504 +0000 UTC m=+56.762962371 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls") pod "node-exporter-9k4bn" (UID: "c82633dc-21cb-4f43-9155-2073ed72f663") : secret "node-exporter-tls" not found Apr 16 14:00:27.086758 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.086732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-wjkdh\" (UID: \"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:27.086840 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.086788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde45541-f349-4350-8270-52b3eaad1325-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-nht7j\" (UID: \"bde45541-f349-4350-8270-52b3eaad1325\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:27.357685 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.357619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" Apr 16 14:00:27.382373 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.382346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" Apr 16 14:00:27.494602 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.494575 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh"] Apr 16 14:00:27.500390 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.500185 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:00:27.506589 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.506572 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.508915 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.508877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:00:27.508996 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.508925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:00:27.509052 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509031 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:00:27.509104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509094 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:00:27.509146 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509110 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:00:27.509409 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:00:27.509530 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:00:27.509530 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509391 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:00:27.509674 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509561 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6ld6d\"" Apr 16 14:00:27.509788 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.509773 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:00:27.516941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.516903 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-nht7j"] Apr 16 14:00:27.518862 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.518840 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:00:27.522310 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:27.522287 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde45541_f349_4350_8270_52b3eaad1325.slice/crio-b67f7610d37da0a8d474892fb9aabf6f9367f6e14d191777f46b151e82841ccb WatchSource:0}: Error finding container b67f7610d37da0a8d474892fb9aabf6f9367f6e14d191777f46b151e82841ccb: Status 404 returned error can't find the container with id b67f7610d37da0a8d474892fb9aabf6f9367f6e14d191777f46b151e82841ccb Apr 16 14:00:27.588408 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588524 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588524 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-web-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588633 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsbd2\" (UniqueName: \"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-kube-api-access-nsbd2\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588633 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588633 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588786 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-out\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588786 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588786 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588732 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588786 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588990 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588990 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.588990 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.588910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689570 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689719 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689719 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-web-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsbd2\" (UniqueName: \"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-kube-api-access-nsbd2\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689769 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.689872 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:27.689810 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle podName:43221ea1-101c-4ab1-874d-4da01f9e5d7a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:28.189778918 +0000 UTC m=+56.868285791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "43221ea1-101c-4ab1-874d-4da01f9e5d7a") : configmap references non-existent config key: ca-bundle.crt Apr 16 14:00:27.690031 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690031 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.689933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-out\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690126 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690126 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690228 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690228 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:00:27.690313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.690849 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.690819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.692890 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.692870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-out\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.693182 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.692978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.693297 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.693083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.693416 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.693395 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.693416 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.693286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName:
\"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.694362 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.694316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.694446 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.694318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.694517 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.694502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.694666 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.694649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43221ea1-101c-4ab1-874d-4da01f9e5d7a-web-config\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:27.699603 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:27.699585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsbd2\" (UniqueName: \"kubernetes.io/projected/43221ea1-101c-4ab1-874d-4da01f9e5d7a-kube-api-access-nsbd2\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:28.093341 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.093318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:28.095304 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.095281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c82633dc-21cb-4f43-9155-2073ed72f663-node-exporter-tls\") pod \"node-exporter-9k4bn\" (UID: \"c82633dc-21cb-4f43-9155-2073ed72f663\") " pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:28.106166 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.106138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" event={"ID":"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd","Type":"ContainerStarted","Data":"081433b1032468b05a4c83e3cb809e30ddea1d145a3f01228bb947aaa5c13ebc"} Apr 16 14:00:28.106261 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.106172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" event={"ID":"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd","Type":"ContainerStarted","Data":"28666d8f7a700cc0b0056a3bc4f1b79b14bd9a89c57bf123528a71afd24975af"} Apr 16 14:00:28.106261 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.106189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" event={"ID":"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd","Type":"ContainerStarted","Data":"aba625b22b24c8cbe5809088821a6a2fa295ef96ca01ec4ba6345fea29955a88"} Apr 16 14:00:28.107119 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.107100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" event={"ID":"bde45541-f349-4350-8270-52b3eaad1325","Type":"ContainerStarted","Data":"b67f7610d37da0a8d474892fb9aabf6f9367f6e14d191777f46b151e82841ccb"} Apr 16 14:00:28.194275 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.194247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:28.194877 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.194860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43221ea1-101c-4ab1-874d-4da01f9e5d7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43221ea1-101c-4ab1-874d-4da01f9e5d7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:28.276452 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.276425 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9k4bn" Apr 16 14:00:28.284575 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:28.284543 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82633dc_21cb_4f43_9155_2073ed72f663.slice/crio-d5b35b1b19c45cade89104516af908296e4694bc01bc45dcc02629e7bc48e6cb WatchSource:0}: Error finding container d5b35b1b19c45cade89104516af908296e4694bc01bc45dcc02629e7bc48e6cb: Status 404 returned error can't find the container with id d5b35b1b19c45cade89104516af908296e4694bc01bc45dcc02629e7bc48e6cb Apr 16 14:00:28.416569 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.416519 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:00:28.481115 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.481089 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:28.481236 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.481148 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:00:28.482206 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.482176 2572 patch_prober.go:28] interesting pod/console-65bf79df8-6cjvd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.8:8443/health\": dial tcp 10.134.0.8:8443: connect: connection refused" start-of-body= Apr 16 14:00:28.482255 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.482225 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-65bf79df8-6cjvd" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerName="console" probeResult="failure" output="Get \"https://10.134.0.8:8443/health\": dial tcp 10.134.0.8:8443: connect: connection refused" Apr 16 14:00:28.538103 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:28.538084 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:00:28.539403 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:28.539383 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43221ea1_101c_4ab1_874d_4da01f9e5d7a.slice/crio-0e98e7210aa7ef84971c77d3f394f4c5c885f5efd7b8a0c3d7617d67115e4159 WatchSource:0}: Error finding container 0e98e7210aa7ef84971c77d3f394f4c5c885f5efd7b8a0c3d7617d67115e4159: Status 404 returned error can't find the container with id 0e98e7210aa7ef84971c77d3f394f4c5c885f5efd7b8a0c3d7617d67115e4159 Apr 16 14:00:29.114377 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.114330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"0e98e7210aa7ef84971c77d3f394f4c5c885f5efd7b8a0c3d7617d67115e4159"} Apr 16 14:00:29.115354 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.115311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k4bn" event={"ID":"c82633dc-21cb-4f43-9155-2073ed72f663","Type":"ContainerStarted","Data":"d5b35b1b19c45cade89104516af908296e4694bc01bc45dcc02629e7bc48e6cb"} Apr 16 14:00:29.412352 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.412259 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-76df994fb-g25kv"] Apr 16 14:00:29.417248 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.417227 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.419781 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.419761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 14:00:29.419877 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.419844 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-brjpw\"" Apr 16 14:00:29.420457 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.419982 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 14:00:29.420457 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.420157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 14:00:29.420457 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.420203 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 14:00:29.420457 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.420307 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 14:00:29.420781 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.420723 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-10ij7lgv5i70d\"" Apr 16 14:00:29.426051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.426029 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76df994fb-g25kv"] Apr 16 14:00:29.503432 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503522 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4q2\" (UniqueName: \"kubernetes.io/projected/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-kube-api-access-vr4q2\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503575 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-grpc-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-metrics-client-ca\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.503716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.503707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605143 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-grpc-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-metrics-client-ca\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605194 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.605650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4q2\" (UniqueName: \"kubernetes.io/projected/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-kube-api-access-vr4q2\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.606154 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.605848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-metrics-client-ca\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.608573 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.608552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.608843 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.608785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" 
Apr 16 14:00:29.609466 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.609444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.609518 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.609467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.609637 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.609611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-grpc-tls\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.610832 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.610804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.613599 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.613581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4q2\" (UniqueName: \"kubernetes.io/projected/aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be-kube-api-access-vr4q2\") pod \"thanos-querier-76df994fb-g25kv\" (UID: \"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be\") " pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.728886 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.728852 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" Apr 16 14:00:29.871546 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:29.871516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76df994fb-g25kv"] Apr 16 14:00:30.076467 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:30.076419 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0c8ff6_8266_47ba_aec5_dfa7c4b0d2be.slice/crio-bd056c3165eed7d3c6d1a2fdfc771edd72c9bc7cafd81a63854b5a4697c30dfc WatchSource:0}: Error finding container bd056c3165eed7d3c6d1a2fdfc771edd72c9bc7cafd81a63854b5a4697c30dfc: Status 404 returned error can't find the container with id bd056c3165eed7d3c6d1a2fdfc771edd72c9bc7cafd81a63854b5a4697c30dfc Apr 16 14:00:30.094649 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.094628 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n4t6t" Apr 16 14:00:30.120003 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.119962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"bd056c3165eed7d3c6d1a2fdfc771edd72c9bc7cafd81a63854b5a4697c30dfc"} Apr 16 14:00:30.121540 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.121512 2572 generic.go:358] "Generic (PLEG): container finished" podID="c82633dc-21cb-4f43-9155-2073ed72f663" containerID="a3dc2fb0420224504370035b1454244385e01c221180b47d7111031d291814b9" exitCode=0 Apr 16 14:00:30.121647 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.121560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k4bn" event={"ID":"c82633dc-21cb-4f43-9155-2073ed72f663","Type":"ContainerDied","Data":"a3dc2fb0420224504370035b1454244385e01c221180b47d7111031d291814b9"} Apr 16 14:00:30.741053 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.741026 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8769677c6-4zv5d"] Apr 16 14:00:30.755522 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.755500 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8769677c6-4zv5d"] Apr 16 14:00:30.755667 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.755603 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.759497 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759324 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-cr9pn\"" Apr 16 14:00:30.759497 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 14:00:30.759686 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759660 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:00:30.759743 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 14:00:30.759869 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d3uc8guen4gl6\"" Apr 16 14:00:30.759975 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.759957 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 14:00:30.818186 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f20570e0-d258-40e7-94da-6a651303df3e-audit-log\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818278 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-client-certs\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818326 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-client-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818398 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-metrics-server-audit-profiles\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818458 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-tls\") pod 
\"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818512 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489d2\" (UniqueName: \"kubernetes.io/projected/f20570e0-d258-40e7-94da-6a651303df3e-kube-api-access-489d2\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.818512 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.818500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.919729 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.919698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-client-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.919826 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.919739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-metrics-server-audit-profiles\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.919929 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.919910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-tls\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.919967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.919951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-489d2\" (UniqueName: \"kubernetes.io/projected/f20570e0-d258-40e7-94da-6a651303df3e-kube-api-access-489d2\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920008 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.919990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.920019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/f20570e0-d258-40e7-94da-6a651303df3e-audit-log\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920070 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.920059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-client-certs\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920522 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.920473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f20570e0-d258-40e7-94da-6a651303df3e-audit-log\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920808 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.920751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.920921 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.920896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f20570e0-d258-40e7-94da-6a651303df3e-metrics-server-audit-profiles\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.922633 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.922605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-tls\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.922888 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.922869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-secret-metrics-server-client-certs\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.922957 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.922875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20570e0-d258-40e7-94da-6a651303df3e-client-ca-bundle\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:30.927762 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:30.927735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-489d2\" (UniqueName: 
\"kubernetes.io/projected/f20570e0-d258-40e7-94da-6a651303df3e-kube-api-access-489d2\") pod \"metrics-server-8769677c6-4zv5d\" (UID: \"f20570e0-d258-40e7-94da-6a651303df3e\") " pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:31.044043 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.044018 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2vdts" Apr 16 14:00:31.066926 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.066844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:00:31.126184 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.126153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" event={"ID":"bde45541-f349-4350-8270-52b3eaad1325","Type":"ContainerStarted","Data":"5b88aa4abfd9c0ae11948c9758fce11765342a36b542758b11dd6fa2b2a875cf"} Apr 16 14:00:31.127931 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.127887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"19f82677c0e2dd90efac0a662bd4a72ebcc934993b55da9683af8a3b25a84c97"} Apr 16 14:00:31.132129 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.132100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k4bn" event={"ID":"c82633dc-21cb-4f43-9155-2073ed72f663","Type":"ContainerStarted","Data":"9d1c9663ecb8420b6023c2acee0698c933f297c36be9800842a0b5f02fb0e040"} Apr 16 14:00:31.132251 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.132136 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k4bn" event={"ID":"c82633dc-21cb-4f43-9155-2073ed72f663","Type":"ContainerStarted","Data":"9ecb1754460af0d2cf4a8cb11f170885744f00c62bd23d381f64d729d47b7bc7"} Apr 16 14:00:31.177353 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.177284 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9k4bn" podStartSLOduration=3.961869525 podStartE2EDuration="5.177270767s" podCreationTimestamp="2026-04-16 14:00:26 +0000 UTC" firstStartedPulling="2026-04-16 14:00:28.28618463 +0000 UTC m=+56.964691496" lastFinishedPulling="2026-04-16 14:00:29.501585872 +0000 UTC m=+58.180092738" observedRunningTime="2026-04-16 14:00:31.176265085 +0000 UTC m=+59.854771974" watchObservedRunningTime="2026-04-16 14:00:31.177270767 +0000 UTC m=+59.855777655" Apr 16 14:00:31.428389 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:31.428366 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8769677c6-4zv5d"] Apr 16 14:00:31.433752 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:31.433725 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20570e0_d258_40e7_94da_6a651303df3e.slice/crio-93985bc6be5ee65403158a7501f650af0678ff6584e806a05049fbc854244ef4 WatchSource:0}: Error finding container 93985bc6be5ee65403158a7501f650af0678ff6584e806a05049fbc854244ef4: Status 404 returned error can't find the container with id 93985bc6be5ee65403158a7501f650af0678ff6584e806a05049fbc854244ef4 Apr 16 14:00:32.136834 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.136801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" event={"ID":"bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd","Type":"ContainerStarted","Data":"0b1b2a2a15d166e394eaad456e3747d5c029c18ee0907a6ce5411a8768d8f4ff"} Apr 16 14:00:32.138055 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.138022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" event={"ID":"f20570e0-d258-40e7-94da-6a651303df3e","Type":"ContainerStarted","Data":"93985bc6be5ee65403158a7501f650af0678ff6584e806a05049fbc854244ef4"} Apr 16 14:00:32.140139 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.140113 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" event={"ID":"bde45541-f349-4350-8270-52b3eaad1325","Type":"ContainerStarted","Data":"e8470189f33eb23ee61d9c525012075b503562a83b95d77a699533a836805064"} Apr 16 14:00:32.140254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.140146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" event={"ID":"bde45541-f349-4350-8270-52b3eaad1325","Type":"ContainerStarted","Data":"87bedc09700271336b74eb27f0eda2a8fe9fb98f3199f75fa818659b566799f4"} Apr 16 14:00:32.141582 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.141557 2572 generic.go:358] "Generic (PLEG): container finished" podID="43221ea1-101c-4ab1-874d-4da01f9e5d7a" containerID="19f82677c0e2dd90efac0a662bd4a72ebcc934993b55da9683af8a3b25a84c97" exitCode=0 Apr 16 14:00:32.141686 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.141650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerDied","Data":"19f82677c0e2dd90efac0a662bd4a72ebcc934993b55da9683af8a3b25a84c97"} Apr 16 14:00:32.157611 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.157565 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-wjkdh" podStartSLOduration=2.145168779 podStartE2EDuration="6.157550575s" podCreationTimestamp="2026-04-16 14:00:26 +0000 UTC" firstStartedPulling="2026-04-16 14:00:27.635914769 +0000 UTC m=+56.314421635" lastFinishedPulling="2026-04-16 14:00:31.648296547 +0000 UTC m=+60.326803431" observedRunningTime="2026-04-16 14:00:32.157021198 +0000 UTC m=+60.835528086" watchObservedRunningTime="2026-04-16 14:00:32.157550575 +0000 UTC m=+60.836057467" Apr 16 14:00:32.179592 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.179550 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-nht7j" podStartSLOduration=2.8869976299999998 podStartE2EDuration="6.179538921s" podCreationTimestamp="2026-04-16 14:00:26 +0000 UTC" firstStartedPulling="2026-04-16 14:00:27.523829109 +0000 UTC m=+56.202335975" lastFinishedPulling="2026-04-16 14:00:30.816370397 +0000 UTC m=+59.494877266" observedRunningTime="2026-04-16 14:00:32.177380343 +0000 UTC m=+60.855887257" watchObservedRunningTime="2026-04-16 14:00:32.179538921 +0000 UTC m=+60.858045808" Apr 16 14:00:32.601539 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.601500 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:00:32.605833 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.605811 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.608856 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.608908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.608923 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.608963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.608855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609087 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609166 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cspp7\"" Apr 16 14:00:32.609292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609245 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:00:32.609766 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:00:32.609766 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609630 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:00:32.609868 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609773 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6l41dur4qumcg\"" Apr 16 14:00:32.609976 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:00:32.610092 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.609993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:00:32.611930 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.611903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:00:32.618434 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.618413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:00:32.634018 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.633994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634148 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634036 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634148 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634148 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634148 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-web-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634313 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmjhw\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-kube-api-access-wmjhw\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.634672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-config-out\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 14:00:32.634672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.634600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735647 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-web-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.735795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjhw\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-kube-api-access-wmjhw\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-config-out\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.735985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736782 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736481 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736782 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.736507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.736782 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:00:32.736774 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle podName:e2577532-0f05-482a-8c50-c96bf606f03d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:33.236751076 +0000 UTC m=+61.915257950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "e2577532-0f05-482a-8c50-c96bf606f03d") : configmap references non-existent config key: ca-bundle.crt
Apr 16 14:00:32.740015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.739990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.740192 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.740168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.741028 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.741001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.742134 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.742110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-web-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.742432 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.742412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.743188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.742827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.743188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.743129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.743307 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.743188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-config\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.743307 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.743204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.743468 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.743444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.744566 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.744500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2577532-0f05-482a-8c50-c96bf606f03d-config-out\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.744566 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.744525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.744671 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.744643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.745104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.745080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e2577532-0f05-482a-8c50-c96bf606f03d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:32.753416 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:32.753370 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjhw\" (UniqueName: \"kubernetes.io/projected/e2577532-0f05-482a-8c50-c96bf606f03d-kube-api-access-wmjhw\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:33.147788 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:33.147716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" event={"ID":"f20570e0-d258-40e7-94da-6a651303df3e","Type":"ContainerStarted","Data":"cab05189ede0a51c5dc14e18f4a18c5b7e63411d2840ea275168db30799b06a2"}
Apr 16 14:00:33.242284 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:33.241122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:33.242740 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:33.242693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2577532-0f05-482a-8c50-c96bf606f03d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e2577532-0f05-482a-8c50-c96bf606f03d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:33.518968 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:33.518939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:34.074720 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.074675 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" podStartSLOduration=2.677401571 podStartE2EDuration="4.074659949s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.435541072 +0000 UTC m=+60.114047938" lastFinishedPulling="2026-04-16 14:00:32.832799448 +0000 UTC m=+61.511306316" observedRunningTime="2026-04-16 14:00:33.167623885 +0000 UTC m=+61.846130774" watchObservedRunningTime="2026-04-16 14:00:34.074659949 +0000 UTC m=+62.753166837"
Apr 16 14:00:34.075437 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.075395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:00:34.152101 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.152035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"32dc50873fd5c226af5af917492e49641597eff8b896e7a946b12ec4ae427307"}
Apr 16 14:00:34.155058 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.155029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"ed8aa5da6228d2eade1f4a61ef273a1fe6a01afb6c218afe9d38c3089b82bd27"}
Apr 16 14:00:34.155155 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.155063 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"ae148635531a2697bd7d335e2fb8c55a7624e534939f4a66ef4a80b3cc57be1e"}
Apr 16 14:00:34.157694 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.157620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"99cd654cf71bf45bc9307d6cc44d9d3f946d3fc3729ee11e995150807287a41f"}
Apr 16 14:00:34.157694 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:34.157652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"575965d2d1899ec4ec996e896b7abe3f3e911daabdc906abf822b055e03f624c"}
Apr 16 14:00:35.161279 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.161206 2572 generic.go:358] "Generic (PLEG): container finished" podID="e2577532-0f05-482a-8c50-c96bf606f03d" containerID="7db729c334ffe286d0fc9d676500d66f0741726c34bdfdcd3b9bb5c123811820" exitCode=0
Apr 16 14:00:35.161679 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.161281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerDied","Data":"7db729c334ffe286d0fc9d676500d66f0741726c34bdfdcd3b9bb5c123811820"}
Apr 16 14:00:35.166706 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.166677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"67277ace521ebb29d9f97f5401ca99b2d6457e82accf7c36c813d3ee708cd40c"}
Apr 16 14:00:35.166805 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.166711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"f083fd11720693cc56e7e874da84c6872a95c490179fabe1d8dc9534408e6011"}
Apr 16 14:00:35.166805 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.166725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"8b1b539767e0d9e779773e2ffadfc0f4e15b3845cf2a48f87f30ff6df3e38715"}
Apr 16 14:00:35.169172 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.169145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"62e2347523e117bd7f05f5669a64f07a7c8e01f9d45984806cbf5a2324485492"}
Apr 16 14:00:35.169264 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.169179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"2fb9b398724ef59df482ea987d01aaa04ca4ed0ce7326eb995cb8eced15dd8a6"}
Apr 16 14:00:35.169264 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.169193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"0ac8558d16bf061519bdbebc08987814096a2fe9d259accdeaa02bee65fa0f50"}
Apr 16 14:00:35.169264 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.169205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" event={"ID":"aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be","Type":"ContainerStarted","Data":"3843623a3b66c86866abec6d11ea2dc750b3631299685ab4cb931155c231dbdc"}
Apr 16 14:00:35.169390 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.169308 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv"
Apr 16 14:00:35.207370 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:35.207304 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv" podStartSLOduration=1.402318124 podStartE2EDuration="6.207293299s" podCreationTimestamp="2026-04-16 14:00:29 +0000 UTC" firstStartedPulling="2026-04-16 14:00:30.078883865 +0000 UTC m=+58.757390731" lastFinishedPulling="2026-04-16 14:00:34.883859031 +0000 UTC m=+63.562365906" observedRunningTime="2026-04-16 14:00:35.206785451 +0000 UTC m=+63.885292356" watchObservedRunningTime="2026-04-16 14:00:35.207293299 +0000 UTC m=+63.885800181"
Apr 16 14:00:36.179788 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:36.179754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43221ea1-101c-4ab1-874d-4da01f9e5d7a","Type":"ContainerStarted","Data":"1dc54e084b4afe0ecae1c01696df0f53c9776134fcad46fc76291a2752da10e1"}
Apr 16 14:00:36.206495 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:36.206451 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.425932296 podStartE2EDuration="9.206438967s" podCreationTimestamp="2026-04-16 14:00:27 +0000 UTC" firstStartedPulling="2026-04-16 14:00:28.541316955 +0000 UTC m=+57.219823822" lastFinishedPulling="2026-04-16 14:00:35.321823611 +0000 UTC m=+64.000330493" observedRunningTime="2026-04-16 14:00:36.205403377 +0000 UTC m=+64.883910284" watchObservedRunningTime="2026-04-16 14:00:36.206438967 +0000 UTC m=+64.884945857"
Apr 16 14:00:37.581672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.581637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl"
Apr 16 14:00:37.583910 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.583884 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:37.595080 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.595052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59442fa9-d5a4-452c-bf14-f93d58af99dc-metrics-certs\") pod \"network-metrics-daemon-7bhkl\" (UID: \"59442fa9-d5a4-452c-bf14-f93d58af99dc\") " pod="openshift-multus/network-metrics-daemon-7bhkl"
Apr 16 14:00:37.613348 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.613320 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\""
Apr 16 14:00:37.621288 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.621268 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bhkl"
Apr 16 14:00:37.682693 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.682656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 14:00:37.685279 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.685254 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:37.695471 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.695449 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:37.706473 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.706444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zjbf\" (UniqueName: \"kubernetes.io/projected/0020e0fe-4923-4ecf-86ba-90de98fb3649-kube-api-access-4zjbf\") pod \"network-check-target-bh7db\" (UID: \"0020e0fe-4923-4ecf-86ba-90de98fb3649\") " pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 14:00:37.736950 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.736929 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7bhkl"]
Apr 16 14:00:37.739406 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:37.739381 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59442fa9_d5a4_452c_bf14_f93d58af99dc.slice/crio-a82d9122b88c2c2ff5d2a611b5e3db9a29f61f330dbf929e6ef4f62b52e0b485 WatchSource:0}: Error finding container a82d9122b88c2c2ff5d2a611b5e3db9a29f61f330dbf929e6ef4f62b52e0b485: Status 404 returned error can't find the container with id a82d9122b88c2c2ff5d2a611b5e3db9a29f61f330dbf929e6ef4f62b52e0b485
Apr 16 14:00:37.910237 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.910191 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcmqf\""
Apr 16 14:00:37.918386 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:37.918371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 14:00:38.034018 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:38.033996 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bh7db"]
Apr 16 14:00:38.036381 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:38.036355 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0020e0fe_4923_4ecf_86ba_90de98fb3649.slice/crio-60ccb9086ec9d9a9ddb82cfe1ea5cfb267f4711e8b6f381e0fe26a8a2902c613 WatchSource:0}: Error finding container 60ccb9086ec9d9a9ddb82cfe1ea5cfb267f4711e8b6f381e0fe26a8a2902c613: Status 404 returned error can't find the container with id 60ccb9086ec9d9a9ddb82cfe1ea5cfb267f4711e8b6f381e0fe26a8a2902c613
Apr 16 14:00:38.187280 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:38.187205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bh7db" event={"ID":"0020e0fe-4923-4ecf-86ba-90de98fb3649","Type":"ContainerStarted","Data":"60ccb9086ec9d9a9ddb82cfe1ea5cfb267f4711e8b6f381e0fe26a8a2902c613"}
Apr 16 14:00:38.188186 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:38.188161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bhkl" event={"ID":"59442fa9-d5a4-452c-bf14-f93d58af99dc","Type":"ContainerStarted","Data":"a82d9122b88c2c2ff5d2a611b5e3db9a29f61f330dbf929e6ef4f62b52e0b485"}
Apr 16 14:00:38.485415 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:38.485389 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65bf79df8-6cjvd"
Apr 16 14:00:38.489942 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:38.489922 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65bf79df8-6cjvd"
Apr 16 14:00:39.196923 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:39.196886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bhkl" event={"ID":"59442fa9-d5a4-452c-bf14-f93d58af99dc","Type":"ContainerStarted","Data":"af9a9df7ea29477ed6c06855bb522245a911a98d13c82852e1b5a1933a35674e"}
Apr 16 14:00:39.197375 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:39.196930 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bhkl" event={"ID":"59442fa9-d5a4-452c-bf14-f93d58af99dc","Type":"ContainerStarted","Data":"401c6de9d56cc852f4b02f54cdad864fbaa98da535c0c943621b6c67e4b9033e"}
Apr 16 14:00:39.214141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:39.214099 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7bhkl" podStartSLOduration=66.119071071 podStartE2EDuration="1m7.214088021s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 14:00:37.741399296 +0000 UTC m=+66.419906162" lastFinishedPulling="2026-04-16 14:00:38.836416231 +0000 UTC m=+67.514923112" observedRunningTime="2026-04-16 14:00:39.211612401 +0000 UTC m=+67.890119290" watchObservedRunningTime="2026-04-16 14:00:39.214088021 +0000 UTC m=+67.892594909"
Apr 16 14:00:41.187396 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:41.187367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-76df994fb-g25kv"
Apr 16 14:00:41.705958 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:41.705925 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"]
Apr 16 14:00:42.207979 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:42.207948 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"98b0467e57eb4c107294de05c0e9e818ebd60ff66b82a329c942dd7578754162"}
Apr 16 14:00:42.209287 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:42.209265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bh7db" event={"ID":"0020e0fe-4923-4ecf-86ba-90de98fb3649","Type":"ContainerStarted","Data":"c0e8167b572aadb2bf8f537536cb35e4094998e332400238d9021945b9e39070"}
Apr 16 14:00:42.209579 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:42.209563 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bh7db"
Apr 16 14:00:42.224510 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:42.224460 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bh7db" podStartSLOduration=67.141486669 podStartE2EDuration="1m11.224444023s" podCreationTimestamp="2026-04-16 13:59:31 +0000 UTC" firstStartedPulling="2026-04-16 14:00:38.038206994 +0000 UTC m=+66.716713859" lastFinishedPulling="2026-04-16 14:00:42.121164333 +0000 UTC m=+70.799671213" observedRunningTime="2026-04-16 14:00:42.223739222 +0000 UTC m=+70.902246112" watchObservedRunningTime="2026-04-16 14:00:42.224444023 +0000 UTC m=+70.902950910"
Apr 16 14:00:43.215693 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.215661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"3ff5c634c10ea91501bea1600eda88b7c38aea46641d43cc1f367505c8443c11"}
Apr 16 14:00:43.216011 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.215701 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"71e575c0e9886a6c43d45ef5b2f6debab82adf248008f9a0996ae22aef7bcc97"}
Apr 16 14:00:43.216011 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.215711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"7ce9f5d05cf672c432d499b0541fc4bf7799dce6b4ae0976f85a6eef7a389d26"}
Apr 16 14:00:43.216011 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.215720 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"3efa7ff986ad776fa51e09d3db53bdb2246572abcb6429603e625c26bce704cf"}
Apr 16 14:00:43.216011 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.215728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e2577532-0f05-482a-8c50-c96bf606f03d","Type":"ContainerStarted","Data":"602353bbae7ec3ef133c753ccd9d605bd8485e7d4dc8ffb9ad5896105753ab5c"}
Apr 16 14:00:43.241801 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.241749 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.290906321 podStartE2EDuration="11.241734944s" podCreationTimestamp="2026-04-16 14:00:32 +0000 UTC" firstStartedPulling="2026-04-16 14:00:35.165117335 +0000 UTC m=+63.843624201" lastFinishedPulling="2026-04-16 14:00:42.115945958 +0000 UTC m=+70.794452824" observedRunningTime="2026-04-16 14:00:43.240123231 +0000 UTC m=+71.918630140" watchObservedRunningTime="2026-04-16 14:00:43.241734944 +0000 UTC m=+71.920241831"
Apr 16 14:00:43.519728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:43.519702 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:00:44.183581 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.183557 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-679b4885ff-mzqw6"]
Apr 16 14:00:44.220557 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.220530 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b4885ff-mzqw6"]
Apr 16 14:00:44.220874 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.220635 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.337880 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.337851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338276 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338392 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338471 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338536 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338651 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whj22\" (UniqueName: \"kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.338762 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.338738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.439872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.439872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.439872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whj22\" (UniqueName: \"kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.439980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.440031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440554 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.440528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440649 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.440577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.440873 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.440828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.451827 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.451801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.451827 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.451825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.451972 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.451952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.453700 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.453684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whj22\" (UniqueName: \"kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22\") pod \"console-679b4885ff-mzqw6\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") " pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.529717 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.529692 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:44.649425 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:44.649395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b4885ff-mzqw6"]
Apr 16 14:00:44.653448 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:00:44.653416 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d35944_6fbd_4cbd_8789_d484b7e5412e.slice/crio-670fc831e0542fc22032090b869476948e532211554007f9e1275c59d2d94414 WatchSource:0}: Error finding container 670fc831e0542fc22032090b869476948e532211554007f9e1275c59d2d94414: Status 404 returned error can't find the container with id 670fc831e0542fc22032090b869476948e532211554007f9e1275c59d2d94414
Apr 16 14:00:45.222492 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:45.222455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b4885ff-mzqw6" event={"ID":"93d35944-6fbd-4cbd-8789-d484b7e5412e","Type":"ContainerStarted","Data":"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"}
Apr 16 14:00:45.222873 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:45.222499 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b4885ff-mzqw6" event={"ID":"93d35944-6fbd-4cbd-8789-d484b7e5412e","Type":"ContainerStarted","Data":"670fc831e0542fc22032090b869476948e532211554007f9e1275c59d2d94414"}
Apr 16 14:00:45.242796 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:45.242755 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-679b4885ff-mzqw6" podStartSLOduration=1.242740537 podStartE2EDuration="1.242740537s" podCreationTimestamp="2026-04-16 14:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:45.241621865 +0000 UTC m=+73.920128754" watchObservedRunningTime="2026-04-16 14:00:45.242740537 +0000 UTC m=+73.921247425"
Apr 16 14:00:51.067210 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:51.067093 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d"
Apr 16 14:00:51.067210 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:51.067173 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d"
Apr 16 14:00:54.530058 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:54.530028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:54.530461 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:54.530099 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:54.534839 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:54.534818 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:00:55.254709 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:00:55.254684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:01:06.727993 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:06.727924 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65bf79df8-6cjvd" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerName="console" containerID="cri-o://cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081" gracePeriod=15
Apr 16 14:01:06.965944 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:06.965923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65bf79df8-6cjvd_1d2408a9-8a08-4dce-83b6-d4e4cfefbb01/console/0.log"
Apr 16 14:01:06.966054 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:06.965983 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65bf79df8-6cjvd"
Apr 16 14:01:07.092927 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.092903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdmmp\" (UniqueName: \"kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093045 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.092951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093045 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.092967 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093045 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.092985 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093045 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093019 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093227 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093052 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093227 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093068 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca\") pod \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\" (UID: \"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01\") "
Apr 16 14:01:07.093483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093312 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:07.093483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093374 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config" (OuterVolumeSpecName: "console-config") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:07.093629 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093578 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca" (OuterVolumeSpecName: "service-ca") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:07.093714 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.093691 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:07.095315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.095273 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:01:07.095483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.095328 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp" (OuterVolumeSpecName: "kube-api-access-pdmmp") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "kube-api-access-pdmmp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:01:07.095483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.095367 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" (UID: "1d2408a9-8a08-4dce-83b6-d4e4cfefbb01"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:01:07.194379 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194352 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-config\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194379 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194378 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-oauth-serving-cert\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194388 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-serving-cert\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194398 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-console-oauth-config\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194407 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-trusted-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194416 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-service-ca\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.194527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.194428 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdmmp\" (UniqueName: \"kubernetes.io/projected/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01-kube-api-access-pdmmp\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:01:07.285960 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.285939 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65bf79df8-6cjvd_1d2408a9-8a08-4dce-83b6-d4e4cfefbb01/console/0.log"
Apr 16 14:01:07.286079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.285974 2572 generic.go:358] "Generic (PLEG): container finished" podID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerID="cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081" exitCode=2
Apr 16 14:01:07.286079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.286005 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65bf79df8-6cjvd" event={"ID":"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01","Type":"ContainerDied","Data":"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081"}
Apr 16 14:01:07.286079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.286045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65bf79df8-6cjvd" event={"ID":"1d2408a9-8a08-4dce-83b6-d4e4cfefbb01","Type":"ContainerDied","Data":"4bbd2642fdf2929501a6788938da8ab04f42660a3abd1e347f55fb01d189ad3d"}
Apr 16 14:01:07.286079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.286056 2572
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65bf79df8-6cjvd" Apr 16 14:01:07.286079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.286070 2572 scope.go:117] "RemoveContainer" containerID="cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081" Apr 16 14:01:07.295143 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.295125 2572 scope.go:117] "RemoveContainer" containerID="cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081" Apr 16 14:01:07.295430 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:01:07.295409 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081\": container with ID starting with cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081 not found: ID does not exist" containerID="cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081" Apr 16 14:01:07.295480 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.295438 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081"} err="failed to get container status \"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081\": rpc error: code = NotFound desc = could not find container \"cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081\": container with ID starting with cafe1adcbfb56069100a39c6044b9b1fd83b9b138defd3e1f5d10e8db7929081 not found: ID does not exist" Apr 16 14:01:07.306828 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.306802 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"] Apr 16 14:01:07.311501 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.311482 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65bf79df8-6cjvd"] Apr 16 14:01:07.903032 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:07.903005 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" path="/var/lib/kubelet/pods/1d2408a9-8a08-4dce-83b6-d4e4cfefbb01/volumes" Apr 16 14:01:11.072787 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:11.072755 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:01:11.076924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:11.076902 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8769677c6-4zv5d" Apr 16 14:01:13.217930 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:13.217898 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bh7db" Apr 16 14:01:33.519657 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:33.519622 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:33.539233 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:33.539206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:34.388625 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:01:34.388599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:09.926160 ip-10-0-128-129 
Apr 16 14:02:34.945833 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:34.945769 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-679b4885ff-mzqw6" podUID="93d35944-6fbd-4cbd-8789-d484b7e5412e" containerName="console" containerID="cri-o://bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20" gracePeriod=15
Apr 16 14:02:35.187660 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.187639 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679b4885ff-mzqw6_93d35944-6fbd-4cbd-8789-d484b7e5412e/console/0.log"
Apr 16 14:02:35.187763 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.187696 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:02:35.230826 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230763 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.230826 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.230997 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230843 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.230997 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230866 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.230997 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230897 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.230997 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.230951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whj22\" (UniqueName: \"kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.231184 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.231010 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config\") pod \"93d35944-6fbd-4cbd-8789-d484b7e5412e\" (UID: \"93d35944-6fbd-4cbd-8789-d484b7e5412e\") "
Apr 16 14:02:35.231232 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.231191 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca" (OuterVolumeSpecName: "service-ca") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:35.231285 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.231235 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:35.231384 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.231300 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:35.231568 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.231547 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config" (OuterVolumeSpecName: "console-config") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:35.233001 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.232971 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:35.233109 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.233083 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:35.233167 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.233140 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22" (OuterVolumeSpecName: "kube-api-access-whj22") pod "93d35944-6fbd-4cbd-8789-d484b7e5412e" (UID: "93d35944-6fbd-4cbd-8789-d484b7e5412e"). InnerVolumeSpecName "kube-api-access-whj22". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:35.331775 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331753 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-serving-cert\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331775 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331774 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-trusted-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331783 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-whj22\" (UniqueName: \"kubernetes.io/projected/93d35944-6fbd-4cbd-8789-d484b7e5412e-kube-api-access-whj22\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331794 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-config\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331803 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-service-ca\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331811 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93d35944-6fbd-4cbd-8789-d484b7e5412e-oauth-serving-cert\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.331897 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.331820 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93d35944-6fbd-4cbd-8789-d484b7e5412e-console-oauth-config\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:02:35.543978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.543959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679b4885ff-mzqw6_93d35944-6fbd-4cbd-8789-d484b7e5412e/console/0.log"
Apr 16 14:02:35.544087 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.543995 2572 generic.go:358] "Generic (PLEG): container finished" podID="93d35944-6fbd-4cbd-8789-d484b7e5412e" containerID="bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20" exitCode=2
Apr 16 14:02:35.544087 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.544031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b4885ff-mzqw6" event={"ID":"93d35944-6fbd-4cbd-8789-d484b7e5412e","Type":"ContainerDied","Data":"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"}
Apr 16 14:02:35.544087 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.544060 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b4885ff-mzqw6"
Apr 16 14:02:35.544087 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.544078 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b4885ff-mzqw6" event={"ID":"93d35944-6fbd-4cbd-8789-d484b7e5412e","Type":"ContainerDied","Data":"670fc831e0542fc22032090b869476948e532211554007f9e1275c59d2d94414"}
Apr 16 14:02:35.544272 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.544102 2572 scope.go:117] "RemoveContainer" containerID="bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"
Apr 16 14:02:35.552244 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.552225 2572 scope.go:117] "RemoveContainer" containerID="bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"
Apr 16 14:02:35.552504 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:02:35.552487 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20\": container with ID starting with bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20 not found: ID does not exist" containerID="bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"
Apr 16 14:02:35.552558 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.552510 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20"} err="failed to get container status \"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20\": rpc error: code = NotFound desc = could not find container \"bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20\": container with ID starting with bb032934ea9824814f607e5e082ee1e1513334202436267ed07412846bf99e20 not found: ID does not exist"
Apr 16 14:02:35.563929 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.563909 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-679b4885ff-mzqw6"]
Apr 16 14:02:35.569932 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.569910 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-679b4885ff-mzqw6"]
Apr 16 14:02:35.902937 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:35.902866 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d35944-6fbd-4cbd-8789-d484b7e5412e" path="/var/lib/kubelet/pods/93d35944-6fbd-4cbd-8789-d484b7e5412e/volumes"
Apr 16 14:02:37.684495 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684462 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-444dx"]
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684771 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerName="console"
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684782 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerName="console"
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684794 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93d35944-6fbd-4cbd-8789-d484b7e5412e" containerName="console"
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684801 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d35944-6fbd-4cbd-8789-d484b7e5412e" containerName="console"
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684849 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="93d35944-6fbd-4cbd-8789-d484b7e5412e" containerName="console"
Apr 16 14:02:37.684870 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.684858 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d2408a9-8a08-4dce-83b6-d4e4cfefbb01" containerName="console"
Apr 16 14:02:37.689057 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.689036 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.691443 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.691420 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:02:37.694281 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.694242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-444dx"]
Apr 16 14:02:37.750039 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.750016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-kubelet-config\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.750124 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.750056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38192d70-e6e7-4372-ae09-7af27ef748e6-original-pull-secret\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.750177 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.750143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-dbus\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.850874 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.850843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-dbus\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.850984 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.850891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-kubelet-config\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
Apr 16 14:02:37.850984 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.850943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38192d70-e6e7-4372-ae09-7af27ef748e6-original-pull-secret\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx"
\"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx" Apr 16 14:02:37.851096 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.851028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-dbus\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx" Apr 16 14:02:37.851096 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.851033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38192d70-e6e7-4372-ae09-7af27ef748e6-kubelet-config\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx" Apr 16 14:02:37.853483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.853463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38192d70-e6e7-4372-ae09-7af27ef748e6-original-pull-secret\") pod \"global-pull-secret-syncer-444dx\" (UID: \"38192d70-e6e7-4372-ae09-7af27ef748e6\") " pod="kube-system/global-pull-secret-syncer-444dx" Apr 16 14:02:37.999008 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:37.998983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-444dx" Apr 16 14:02:38.116256 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:38.116230 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-444dx"] Apr 16 14:02:38.118874 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:02:38.118843 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38192d70_e6e7_4372_ae09_7af27ef748e6.slice/crio-23be32e312680dc3b21e66ef6460e90593877ed91af1ce000aba453cf95b24e6 WatchSource:0}: Error finding container 23be32e312680dc3b21e66ef6460e90593877ed91af1ce000aba453cf95b24e6: Status 404 returned error can't find the container with id 23be32e312680dc3b21e66ef6460e90593877ed91af1ce000aba453cf95b24e6 Apr 16 14:02:38.554553 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:38.554525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-444dx" event={"ID":"38192d70-e6e7-4372-ae09-7af27ef748e6","Type":"ContainerStarted","Data":"23be32e312680dc3b21e66ef6460e90593877ed91af1ce000aba453cf95b24e6"} Apr 16 14:02:44.573553 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:44.573477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-444dx" event={"ID":"38192d70-e6e7-4372-ae09-7af27ef748e6","Type":"ContainerStarted","Data":"623b8b5b72b9eb3813760ad216b7ad275cd64c458b4152b1ac373a4d3259a423"} Apr 16 14:02:44.587891 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:02:44.587841 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-444dx" podStartSLOduration=1.436789674 podStartE2EDuration="7.587829174s" podCreationTimestamp="2026-04-16 14:02:37 +0000 UTC" firstStartedPulling="2026-04-16 14:02:38.120581967 +0000 UTC m=+186.799088833" lastFinishedPulling="2026-04-16 14:02:44.271621465 +0000 UTC m=+192.950128333" observedRunningTime="2026-04-16 14:02:44.587210747 +0000 UTC m=+193.265717636" watchObservedRunningTime="2026-04-16 
14:02:44.587829174 +0000 UTC m=+193.266336061" Apr 16 14:04:31.795409 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:04:31.795377 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:05:26.054735 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.054701 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:05:26.058751 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.058720 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.061207 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.061169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 14:05:26.061459 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.061441 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-md4nm\"" Apr 16 14:05:26.062528 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.062507 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:05:26.062653 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.062622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:05:26.064546 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.064521 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:05:26.090703 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.090678 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-9tc42"] Apr 16 14:05:26.092833 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.092819 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.095224 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.095201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5rcrv\"" Apr 16 14:05:26.095357 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.095266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 14:05:26.102804 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.102782 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-9tc42"] Apr 16 14:05:26.136680 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.136660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.136772 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.136694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8939169a-f799-4397-b932-fc30821c51b2-data\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.136772 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.136753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x64r6\" (UniqueName: \"kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.136847 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.136788 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/8939169a-f799-4397-b932-fc30821c51b2-kube-api-access-tr27r\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.237965 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.237940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.238071 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.237974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8939169a-f799-4397-b932-fc30821c51b2-data\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.238071 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.238001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x64r6\" (UniqueName: \"kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " 
pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.238182 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:05:26.238094 2572 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 14:05:26.238182 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.238143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/8939169a-f799-4397-b932-fc30821c51b2-kube-api-access-tr27r\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.238182 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:05:26.238165 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert podName:307f7be9-41ec-4eb2-803b-a653931489fe nodeName:}" failed. No retries permitted until 2026-04-16 14:05:26.73814467 +0000 UTC m=+355.416651564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert") pod "kserve-controller-manager-75d667c7c4-dsm66" (UID: "307f7be9-41ec-4eb2-803b-a653931489fe") : secret "kserve-webhook-server-cert" not found Apr 16 14:05:26.238316 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.238294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8939169a-f799-4397-b932-fc30821c51b2-data\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.255086 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.255059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x64r6\" (UniqueName: \"kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.255221 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.255207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr27r\" (UniqueName: \"kubernetes.io/projected/8939169a-f799-4397-b932-fc30821c51b2-kube-api-access-tr27r\") pod \"seaweedfs-86cc847c5c-9tc42\" (UID: \"8939169a-f799-4397-b932-fc30821c51b2\") " pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.402932 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.402867 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:26.516511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.516482 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-9tc42"] Apr 16 14:05:26.519944 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:05:26.519919 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8939169a_f799_4397_b932_fc30821c51b2.slice/crio-2114d45d06f8a9ed750a36aede1d2bbbf1f830a723324bfb85d1bc3ed870028e WatchSource:0}: Error finding container 2114d45d06f8a9ed750a36aede1d2bbbf1f830a723324bfb85d1bc3ed870028e: Status 404 returned error can't find the container with id 2114d45d06f8a9ed750a36aede1d2bbbf1f830a723324bfb85d1bc3ed870028e Apr 16 14:05:26.521193 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.521171 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:05:26.742353 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.742308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.744699 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.744682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") pod \"kserve-controller-manager-75d667c7c4-dsm66\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:26.969846 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:26.969818 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:27.038667 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:27.038604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-9tc42" event={"ID":"8939169a-f799-4397-b932-fc30821c51b2","Type":"ContainerStarted","Data":"2114d45d06f8a9ed750a36aede1d2bbbf1f830a723324bfb85d1bc3ed870028e"} Apr 16 14:05:27.110350 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:27.110299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:05:27.114584 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:05:27.114554 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod307f7be9_41ec_4eb2_803b_a653931489fe.slice/crio-6dea639532b5752584acbb8d587b4a6537ef5e944b561a5333b9f8481b8d240a WatchSource:0}: Error finding container 6dea639532b5752584acbb8d587b4a6537ef5e944b561a5333b9f8481b8d240a: Status 404 returned error can't find the container with id 6dea639532b5752584acbb8d587b4a6537ef5e944b561a5333b9f8481b8d240a Apr 16 14:05:28.041949 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:28.041902 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" event={"ID":"307f7be9-41ec-4eb2-803b-a653931489fe","Type":"ContainerStarted","Data":"6dea639532b5752584acbb8d587b4a6537ef5e944b561a5333b9f8481b8d240a"} Apr 16 14:05:30.049163 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:30.049128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-9tc42" event={"ID":"8939169a-f799-4397-b932-fc30821c51b2","Type":"ContainerStarted","Data":"a77b6deb91bfcf6ef541ebd53c031f04e5254c1bff92b13697a8102c7638162d"} Apr 16 14:05:30.049618 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:30.049271 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:05:30.065633 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:30.065582 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-9tc42" podStartSLOduration=1.472381728 podStartE2EDuration="4.065565739s" podCreationTimestamp="2026-04-16 14:05:26 +0000 UTC" firstStartedPulling="2026-04-16 14:05:26.52132505 +0000 UTC m=+355.199831916" lastFinishedPulling="2026-04-16 14:05:29.114509053 +0000 UTC m=+357.793015927" observedRunningTime="2026-04-16 14:05:30.063425053 +0000 UTC m=+358.741931983" watchObservedRunningTime="2026-04-16 14:05:30.065565739 +0000 UTC m=+358.744072629" Apr 16 14:05:32.055591 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:32.055556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" event={"ID":"307f7be9-41ec-4eb2-803b-a653931489fe","Type":"ContainerStarted","Data":"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf"} Apr 16 14:05:32.055928 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:32.055705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:05:32.070939 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:32.070876 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" podStartSLOduration=1.540611875 podStartE2EDuration="6.070857917s" podCreationTimestamp="2026-04-16 14:05:26 
+0000 UTC" firstStartedPulling="2026-04-16 14:05:27.116635352 +0000 UTC m=+355.795142225" lastFinishedPulling="2026-04-16 14:05:31.646881401 +0000 UTC m=+360.325388267" observedRunningTime="2026-04-16 14:05:32.070591175 +0000 UTC m=+360.749098064" watchObservedRunningTime="2026-04-16 14:05:32.070857917 +0000 UTC m=+360.749364806" Apr 16 14:05:36.054799 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:05:36.054770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-9tc42" Apr 16 14:06:00.800275 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.800237 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69cf668bdb-nnwqf"] Apr 16 14:06:00.805774 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.805752 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.808622 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.808599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:06:00.809786 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.809765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:06:00.809878 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.809787 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:06:00.809878 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.809829 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:06:00.809878 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.809829 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:06:00.810497 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.810280 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sb62k\"" Apr 16 14:06:00.810681 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.810662 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:06:00.811790 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.811772 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:06:00.813811 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.813788 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69cf668bdb-nnwqf"] Apr 16 14:06:00.814656 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.814638 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:06:00.892556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892642 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892564 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwdc\" (UniqueName: \"kubernetes.io/projected/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-kube-api-access-wlwdc\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892642 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892642 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-oauth-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892745 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-trusted-ca-bundle\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892778 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-oauth-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.892810 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.892776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-service-ca\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.993823 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.993804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-service-ca\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.993913 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.993878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.993913 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.993901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwdc\" (UniqueName: 
\"kubernetes.io/projected/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-kube-api-access-wlwdc\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994219 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994219 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-oauth-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994219 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-trusted-ca-bundle\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994442 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-oauth-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994632 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994610 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-service-ca\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994726 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994776 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-oauth-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.994937 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.994917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-trusted-ca-bundle\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.996761 ip-10-0-128-129 kubenswrapper[2572]: 
I0416 14:06:00.996741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-serving-cert\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:00.996857 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:00.996839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-console-oauth-config\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:01.001942 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:01.001924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwdc\" (UniqueName: \"kubernetes.io/projected/7aa05115-7b88-45e3-b8e3-21cda1b7e7cf-kube-api-access-wlwdc\") pod \"console-69cf668bdb-nnwqf\" (UID: \"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf\") " pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:01.117215 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:01.117160 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:01.237595 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:01.237570 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69cf668bdb-nnwqf"] Apr 16 14:06:01.240404 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:06:01.240376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa05115_7b88_45e3_b8e3_21cda1b7e7cf.slice/crio-d90eda8c07b2995a2c88c41ebcc1ffe597458af729416b28955bc883fc712451 WatchSource:0}: Error finding container d90eda8c07b2995a2c88c41ebcc1ffe597458af729416b28955bc883fc712451: Status 404 returned error can't find the container with id d90eda8c07b2995a2c88c41ebcc1ffe597458af729416b28955bc883fc712451 Apr 16 14:06:02.112386 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.112350 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:06:02.112783 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.112601 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" podUID="307f7be9-41ec-4eb2-803b-a653931489fe" containerName="manager" containerID="cri-o://607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf" gracePeriod=10 Apr 16 14:06:02.117634 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.117602 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:06:02.133923 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.133898 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-z4dcx"] Apr 16 14:06:02.137251 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.137233 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.137460 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.137441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cf668bdb-nnwqf" event={"ID":"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf","Type":"ContainerStarted","Data":"adfddb9f185eb6be661b262140ceb5c1ad892cd3a86242f199e74a074f533086"} Apr 16 14:06:02.137514 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.137468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cf668bdb-nnwqf" event={"ID":"7aa05115-7b88-45e3-b8e3-21cda1b7e7cf","Type":"ContainerStarted","Data":"d90eda8c07b2995a2c88c41ebcc1ffe597458af729416b28955bc883fc712451"} Apr 16 14:06:02.143402 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.143381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-z4dcx"] Apr 16 14:06:02.174037 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.173999 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69cf668bdb-nnwqf" podStartSLOduration=2.17398394 podStartE2EDuration="2.17398394s" podCreationTimestamp="2026-04-16 14:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:06:02.172409765 +0000 UTC m=+390.850916654" watchObservedRunningTime="2026-04-16 14:06:02.17398394 +0000 UTC m=+390.852490830" Apr 16 14:06:02.202690 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.202667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ff749b3-9a80-4bc3-80b4-276093aebd9d-cert\") pod \"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.202816 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.202800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgd2x\" (UniqueName: \"kubernetes.io/projected/7ff749b3-9a80-4bc3-80b4-276093aebd9d-kube-api-access-sgd2x\") pod \"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.303844 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.303812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ff749b3-9a80-4bc3-80b4-276093aebd9d-cert\") pod \"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.303989 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.303878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgd2x\" (UniqueName: \"kubernetes.io/projected/7ff749b3-9a80-4bc3-80b4-276093aebd9d-kube-api-access-sgd2x\") pod \"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.306435 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.306411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ff749b3-9a80-4bc3-80b4-276093aebd9d-cert\") pod 
\"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.311868 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.311840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgd2x\" (UniqueName: \"kubernetes.io/projected/7ff749b3-9a80-4bc3-80b4-276093aebd9d-kube-api-access-sgd2x\") pod \"kserve-controller-manager-75d667c7c4-z4dcx\" (UID: \"7ff749b3-9a80-4bc3-80b4-276093aebd9d\") " pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.341025 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.341005 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:06:02.404188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.404128 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") pod \"307f7be9-41ec-4eb2-803b-a653931489fe\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " Apr 16 14:06:02.404299 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.404196 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x64r6\" (UniqueName: \"kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6\") pod \"307f7be9-41ec-4eb2-803b-a653931489fe\" (UID: \"307f7be9-41ec-4eb2-803b-a653931489fe\") " Apr 16 14:06:02.406289 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.406267 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert" (OuterVolumeSpecName: "cert") pod "307f7be9-41ec-4eb2-803b-a653931489fe" (UID: "307f7be9-41ec-4eb2-803b-a653931489fe"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:06:02.406357 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.406295 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6" (OuterVolumeSpecName: "kube-api-access-x64r6") pod "307f7be9-41ec-4eb2-803b-a653931489fe" (UID: "307f7be9-41ec-4eb2-803b-a653931489fe"). InnerVolumeSpecName "kube-api-access-x64r6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:06:02.492681 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.492631 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:02.505685 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.505658 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x64r6\" (UniqueName: \"kubernetes.io/projected/307f7be9-41ec-4eb2-803b-a653931489fe-kube-api-access-x64r6\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:06:02.505685 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.505686 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/307f7be9-41ec-4eb2-803b-a653931489fe-cert\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:06:02.614296 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:02.614273 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-z4dcx"] Apr 16 14:06:02.616929 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:06:02.616900 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff749b3_9a80_4bc3_80b4_276093aebd9d.slice/crio-0083e5e79b6762eeab0180f2e269bd66540b66941300c46998f914c3c85ab166 WatchSource:0}: Error finding container 0083e5e79b6762eeab0180f2e269bd66540b66941300c46998f914c3c85ab166: Status 404 returned error can't find the container with id 0083e5e79b6762eeab0180f2e269bd66540b66941300c46998f914c3c85ab166 Apr 16 14:06:03.141576 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.141546 2572 generic.go:358] "Generic (PLEG): container finished" podID="307f7be9-41ec-4eb2-803b-a653931489fe" containerID="607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf" exitCode=0 Apr 16 14:06:03.141928 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.141621 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" Apr 16 14:06:03.141928 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.141637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" event={"ID":"307f7be9-41ec-4eb2-803b-a653931489fe","Type":"ContainerDied","Data":"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf"} Apr 16 14:06:03.141928 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.141679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dsm66" event={"ID":"307f7be9-41ec-4eb2-803b-a653931489fe","Type":"ContainerDied","Data":"6dea639532b5752584acbb8d587b4a6537ef5e944b561a5333b9f8481b8d240a"} Apr 16 14:06:03.141928 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.141701 2572 scope.go:117] "RemoveContainer" containerID="607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf" Apr 16 14:06:03.142799 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.142768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" event={"ID":"7ff749b3-9a80-4bc3-80b4-276093aebd9d","Type":"ContainerStarted","Data":"0083e5e79b6762eeab0180f2e269bd66540b66941300c46998f914c3c85ab166"} Apr 16 14:06:03.149724 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.149705 2572 scope.go:117] "RemoveContainer" containerID="607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf" Apr 16 14:06:03.149985 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:06:03.149962 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf\": container with ID starting with 607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf not found: ID does not exist" containerID="607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf" Apr 16 14:06:03.150038 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.149996 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf"} err="failed to get container status \"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf\": rpc error: code = NotFound desc = could not find container \"607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf\": container with ID starting with 607680c609d4d03d863c282e4816bb63de36ab08fb540e3782a24a2436ce9eaf not found: ID does not exist" Apr 16 14:06:03.161734 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.161712 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:06:03.165022 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.165003 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dsm66"] Apr 16 14:06:03.903234 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:03.903203 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307f7be9-41ec-4eb2-803b-a653931489fe" path="/var/lib/kubelet/pods/307f7be9-41ec-4eb2-803b-a653931489fe/volumes" Apr 16 14:06:04.148079 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:04.148007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" 
event={"ID":"7ff749b3-9a80-4bc3-80b4-276093aebd9d","Type":"ContainerStarted","Data":"b0450c651068a9f56b637048e5cd0e1a474193b80dafd9baa775b168bdd10179"} Apr 16 14:06:04.148441 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:04.148234 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:04.163965 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:04.163904 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" podStartSLOduration=1.040062743 podStartE2EDuration="2.163892267s" podCreationTimestamp="2026-04-16 14:06:02 +0000 UTC" firstStartedPulling="2026-04-16 14:06:02.618428438 +0000 UTC m=+391.296935305" lastFinishedPulling="2026-04-16 14:06:03.742257961 +0000 UTC m=+392.420764829" observedRunningTime="2026-04-16 14:06:04.163517301 +0000 UTC m=+392.842024190" watchObservedRunningTime="2026-04-16 14:06:04.163892267 +0000 UTC m=+392.842399154" Apr 16 14:06:11.117907 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:11.117852 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:11.118440 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:11.117939 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:11.123591 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:11.123567 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:11.172567 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:11.172546 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69cf668bdb-nnwqf" Apr 16 14:06:35.156474 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:35.156444 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-z4dcx" Apr 16 14:06:52.969041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.968956 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-h6gqg"] Apr 16 14:06:52.969501 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.969323 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="307f7be9-41ec-4eb2-803b-a653931489fe" containerName="manager" Apr 16 14:06:52.969501 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.969365 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f7be9-41ec-4eb2-803b-a653931489fe" containerName="manager" Apr 16 14:06:52.969574 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.969507 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="307f7be9-41ec-4eb2-803b-a653931489fe" containerName="manager" Apr 16 14:06:52.972545 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.972528 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h6gqg" Apr 16 14:06:52.980364 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:52.980324 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h6gqg"] Apr 16 14:06:53.066558 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:53.066526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzxl\" (UniqueName: \"kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl\") pod \"s3-init-h6gqg\" (UID: \"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5\") " pod="kserve/s3-init-h6gqg" Apr 16 14:06:53.167856 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:53.167820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzxl\" (UniqueName: \"kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl\") pod \"s3-init-h6gqg\" (UID: \"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5\") " pod="kserve/s3-init-h6gqg" Apr 16 14:06:53.188211 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:53.188190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzxl\" (UniqueName: \"kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl\") pod \"s3-init-h6gqg\" (UID: \"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5\") " pod="kserve/s3-init-h6gqg" Apr 16 14:06:53.292987 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:53.292968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-h6gqg" Apr 16 14:06:53.413044 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:53.413020 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h6gqg"] Apr 16 14:06:53.415751 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:06:53.415726 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a5e33de_9ecc_4979_bdd4_32c01cf17ac5.slice/crio-f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2 WatchSource:0}: Error finding container f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2: Status 404 returned error can't find the container with id f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2 Apr 16 14:06:54.289221 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:54.289182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h6gqg" event={"ID":"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5","Type":"ContainerStarted","Data":"f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2"} Apr 16 14:06:58.305538 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:58.305497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h6gqg" event={"ID":"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5","Type":"ContainerStarted","Data":"db65725e0ed156d968d1334af8e5c7d9a87a6b3c968768980fbe37f854c4b828"} Apr 16 14:06:58.321242 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:06:58.321200 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-h6gqg" podStartSLOduration=1.905821386 podStartE2EDuration="6.321188744s" podCreationTimestamp="2026-04-16 14:06:52 +0000 UTC" firstStartedPulling="2026-04-16 14:06:53.417580938 +0000 UTC m=+442.096087820" lastFinishedPulling="2026-04-16 14:06:57.832948312 +0000 UTC m=+446.511455178" observedRunningTime="2026-04-16 14:06:58.320076678 +0000 UTC m=+446.998583566" watchObservedRunningTime="2026-04-16 14:06:58.321188744 +0000 UTC 
m=+446.999695632" Apr 16 14:07:01.316860 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:01.316824 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a5e33de-9ecc-4979-bdd4-32c01cf17ac5" containerID="db65725e0ed156d968d1334af8e5c7d9a87a6b3c968768980fbe37f854c4b828" exitCode=0 Apr 16 14:07:01.317242 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:01.316899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h6gqg" event={"ID":"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5","Type":"ContainerDied","Data":"db65725e0ed156d968d1334af8e5c7d9a87a6b3c968768980fbe37f854c4b828"} Apr 16 14:07:02.446283 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:02.446261 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-h6gqg" Apr 16 14:07:02.547981 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:02.547953 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kzxl\" (UniqueName: \"kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl\") pod \"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5\" (UID: \"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5\") " Apr 16 14:07:02.550092 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:02.550070 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl" (OuterVolumeSpecName: "kube-api-access-8kzxl") pod "3a5e33de-9ecc-4979-bdd4-32c01cf17ac5" (UID: "3a5e33de-9ecc-4979-bdd4-32c01cf17ac5"). InnerVolumeSpecName "kube-api-access-8kzxl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:02.649106 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:02.649051 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kzxl\" (UniqueName: \"kubernetes.io/projected/3a5e33de-9ecc-4979-bdd4-32c01cf17ac5-kube-api-access-8kzxl\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:07:03.323562 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:03.323532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h6gqg" event={"ID":"3a5e33de-9ecc-4979-bdd4-32c01cf17ac5","Type":"ContainerDied","Data":"f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2"} Apr 16 14:07:03.323562 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:03.323562 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a3e309ec4d008f6684a8b54cb5e916aecf747ec6fbad7e32dda02732ba1ac2" Apr 16 14:07:03.323748 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:03.323570 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h6gqg" Apr 16 14:07:12.556815 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.556784 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"] Apr 16 14:07:12.557250 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.557098 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5e33de-9ecc-4979-bdd4-32c01cf17ac5" containerName="s3-init" Apr 16 14:07:12.557250 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.557108 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5e33de-9ecc-4979-bdd4-32c01cf17ac5" containerName="s3-init" Apr 16 14:07:12.557250 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.557173 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a5e33de-9ecc-4979-bdd4-32c01cf17ac5" containerName="s3-init" Apr 16 14:07:12.560370 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.560328 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:12.562666 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.562646 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\"" Apr 16 14:07:12.566835 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.566815 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"] Apr 16 14:07:12.619758 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.619727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k\" (UID: \"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:12.720382 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.720318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k\" (UID: \"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:12.720764 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.720744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k\" (UID: \"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:12.764611 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.764582 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"] Apr 16 14:07:12.767962 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.767941 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:12.777727 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.777706 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"] Apr 16 14:07:12.820857 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.820794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4xk55\" (UID: \"f6a68343-1673-4124-a36a-176be7f5aa7b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:12.871707 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.871675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:12.922258 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.922225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4xk55\" (UID: \"f6a68343-1673-4124-a36a-176be7f5aa7b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:12.922618 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.922595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4xk55\" (UID: \"f6a68343-1673-4124-a36a-176be7f5aa7b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:12.971417 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.971390 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"] Apr 16 14:07:12.975907 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.975882 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:12.982801 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.982777 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"] Apr 16 14:07:12.997051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:12.997032 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"] Apr 16 14:07:13.000112 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:07:13.000089 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa431d5e_97ef_4a0b_b2f7_9a17ee01716b.slice/crio-59870484d53a29fbfc080a8df4894d09e58ef9a93722ba0ef3d7e451bad1b1de WatchSource:0}: Error finding container 59870484d53a29fbfc080a8df4894d09e58ef9a93722ba0ef3d7e451bad1b1de: Status 404 returned error can't find the container with id 59870484d53a29fbfc080a8df4894d09e58ef9a93722ba0ef3d7e451bad1b1de Apr 16 14:07:13.078429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.078377 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:13.123077 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.123039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228\" (UID: \"4b6be662-4ac7-4979-9a73-266d1010c89a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:13.195878 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.195803 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"] Apr 16 14:07:13.198396 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:07:13.198366 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a68343_1673_4124_a36a_176be7f5aa7b.slice/crio-f43ee3cb5e6803cdc8f6fed9bb27e287fd47c7770d24dd0c1b9ea84785c069af WatchSource:0}: Error finding container f43ee3cb5e6803cdc8f6fed9bb27e287fd47c7770d24dd0c1b9ea84785c069af: Status 404 returned error can't find the container with id f43ee3cb5e6803cdc8f6fed9bb27e287fd47c7770d24dd0c1b9ea84785c069af Apr 16 14:07:13.224042 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.224021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228\" (UID: \"4b6be662-4ac7-4979-9a73-266d1010c89a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:13.224314 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.224297 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228\" (UID: \"4b6be662-4ac7-4979-9a73-266d1010c89a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:13.289933 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.289911 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:13.353219 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.353181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerStarted","Data":"f43ee3cb5e6803cdc8f6fed9bb27e287fd47c7770d24dd0c1b9ea84785c069af"} Apr 16 14:07:13.354218 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.354194 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerStarted","Data":"59870484d53a29fbfc080a8df4894d09e58ef9a93722ba0ef3d7e451bad1b1de"} Apr 16 14:07:13.407121 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:13.407096 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"] Apr 16 14:07:13.408937 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:07:13.408911 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b6be662_4ac7_4979_9a73_266d1010c89a.slice/crio-310d27f43ff5bf01a14963c17f6a64724d6671d67e265d499a1c2537165c5bf0 WatchSource:0}: Error finding container 310d27f43ff5bf01a14963c17f6a64724d6671d67e265d499a1c2537165c5bf0: Status 404 returned error can't find the container with id 310d27f43ff5bf01a14963c17f6a64724d6671d67e265d499a1c2537165c5bf0 Apr 16 14:07:14.358520 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:14.358483 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerStarted","Data":"310d27f43ff5bf01a14963c17f6a64724d6671d67e265d499a1c2537165c5bf0"} Apr 16 14:07:18.376808 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:18.376775 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerStarted","Data":"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"} Apr 16 14:07:18.378098 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:18.378074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerStarted","Data":"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"} Apr 16 14:07:18.379301 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:18.379271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerStarted","Data":"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b"} Apr 16 14:07:21.389979 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.389906 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerID="c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406" exitCode=0 Apr 16 14:07:21.390321 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.389991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" 
event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerDied","Data":"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"} Apr 16 14:07:21.391599 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.391578 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerID="9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b" exitCode=0 Apr 16 14:07:21.391696 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.391627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerDied","Data":"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b"} Apr 16 14:07:21.393637 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.393612 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerID="3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a" exitCode=0 Apr 16 14:07:21.393725 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:21.393651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerDied","Data":"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"} Apr 16 14:07:48.504142 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.504103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerStarted","Data":"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"} Apr 16 14:07:48.504630 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.504379 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:07:48.505861 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.505837 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerStarted","Data":"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"} Apr 16 14:07:48.505995 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.505877 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:07:48.506131 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.506108 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:07:48.507253 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.507229 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:07:48.507564 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.507544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerStarted","Data":"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372"} Apr 16 14:07:48.507826 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.507796 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:07:48.508762 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.508742 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:07:48.521924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.521874 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podStartSLOduration=1.872304381 podStartE2EDuration="36.521860972s" podCreationTimestamp="2026-04-16 14:07:12 +0000 UTC" firstStartedPulling="2026-04-16 14:07:13.410752785 +0000 UTC m=+462.089259650" lastFinishedPulling="2026-04-16 14:07:48.060309369 +0000 UTC m=+496.738816241" observedRunningTime="2026-04-16 14:07:48.519564181 +0000 UTC m=+497.198071068" watchObservedRunningTime="2026-04-16 14:07:48.521860972 +0000 UTC m=+497.200367861" Apr 16 14:07:48.535262 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.535220 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podStartSLOduration=1.477578162 podStartE2EDuration="36.535209775s" podCreationTimestamp="2026-04-16 14:07:12 +0000 UTC" firstStartedPulling="2026-04-16 14:07:13.002689117 +0000 UTC m=+461.681195989" lastFinishedPulling="2026-04-16 14:07:48.060320725 +0000 UTC m=+496.738827602" observedRunningTime="2026-04-16 14:07:48.533596535 +0000 UTC m=+497.212103424" watchObservedRunningTime="2026-04-16 14:07:48.535209775 +0000 UTC m=+497.213716662" Apr 16 14:07:48.554161 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:48.554123 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podStartSLOduration=1.627228235 podStartE2EDuration="36.554109334s" podCreationTimestamp="2026-04-16 14:07:12 +0000 UTC" firstStartedPulling="2026-04-16 14:07:13.200282804 +0000 UTC m=+461.878789669" lastFinishedPulling="2026-04-16 14:07:48.127163902 +0000 UTC m=+496.805670768" observedRunningTime="2026-04-16 14:07:48.552782138 +0000 UTC m=+497.231289037" watchObservedRunningTime="2026-04-16 14:07:48.554109334 +0000 UTC m=+497.232616222" Apr 16 14:07:49.510903 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:49.510862 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:07:49.511372 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:49.510866 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.22:8080: connect: connection refused" Apr 16 14:07:49.511372 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:49.510866 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:07:59.511429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:59.511383 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:07:59.511429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:59.511402 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:07:59.511889 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:07:59.511402 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:08:09.511882 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:09.511839 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:08:09.512315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:09.511842 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:08:09.512315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:09.511842 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:08:19.511478 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:19.511400 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:08:19.511837 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:19.511396 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:08:19.511837 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:19.511396 2572 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:08:29.511570 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:29.511532 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 14:08:29.511924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:29.511532 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:08:29.511924 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:29.511532 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:08:32.534650 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.534620 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"] Apr 16 14:08:32.539279 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.539261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:32.541712 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.541686 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b65bd-serving-cert\"" Apr 16 14:08:32.541830 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.541712 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:08:32.541830 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.541697 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b65bd-kube-rbac-proxy-sar-config\"" Apr 16 14:08:32.548099 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.548076 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"] Apr 16 14:08:32.675506 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.675479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:32.675638 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.675530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " 
pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:32.775919 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.775890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:32.776044 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.775976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:32.776044 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:08:32.776034 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-b65bd-serving-cert: secret "switch-graph-b65bd-serving-cert" not found Apr 16 14:08:32.776156 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:08:32.776108 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls podName:61ddac6d-b172-44b4-85de-3488958a876b nodeName:}" failed. No retries permitted until 2026-04-16 14:08:33.276090896 +0000 UTC m=+541.954597762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls") pod "switch-graph-b65bd-66566fb599-49wh4" (UID: "61ddac6d-b172-44b4-85de-3488958a876b") : secret "switch-graph-b65bd-serving-cert" not found Apr 16 14:08:32.776628 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:32.776607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:33.279915 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:33.279884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:33.282371 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:33.282324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") pod \"switch-graph-b65bd-66566fb599-49wh4\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:33.449789 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:33.449762 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:33.573317 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:33.573287 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"] Apr 16 14:08:33.575124 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:08:33.575098 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ddac6d_b172_44b4_85de_3488958a876b.slice/crio-99b648f900297dfdc9b7ee220531897cf1ddc3a85a80746384f3c1baa4edd34a WatchSource:0}: Error finding container 99b648f900297dfdc9b7ee220531897cf1ddc3a85a80746384f3c1baa4edd34a: Status 404 returned error can't find the container with id 99b648f900297dfdc9b7ee220531897cf1ddc3a85a80746384f3c1baa4edd34a Apr 16 14:08:33.650009 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:33.649980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" event={"ID":"61ddac6d-b172-44b4-85de-3488958a876b","Type":"ContainerStarted","Data":"99b648f900297dfdc9b7ee220531897cf1ddc3a85a80746384f3c1baa4edd34a"} Apr 16 14:08:36.661231 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:36.661151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" event={"ID":"61ddac6d-b172-44b4-85de-3488958a876b","Type":"ContainerStarted","Data":"5d907047ed54f748dbaab23ccc5181a6d5e7e9a0db81594659fb8c165ea8bb81"} Apr 16 14:08:36.661601 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:36.661299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:36.677854 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:36.677820 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podStartSLOduration=1.982231762 podStartE2EDuration="4.677809782s" podCreationTimestamp="2026-04-16 14:08:32 +0000 UTC" firstStartedPulling="2026-04-16 14:08:33.577269977 +0000 UTC m=+542.255777060" lastFinishedPulling="2026-04-16 14:08:36.272848214 +0000 UTC m=+544.951355080" observedRunningTime="2026-04-16 14:08:36.676801039 +0000 UTC m=+545.355307926" watchObservedRunningTime="2026-04-16 14:08:36.677809782 +0000 UTC m=+545.356316670" Apr 16 14:08:39.511033 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:39.510995 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 14:08:39.511518 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:39.510995 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 14:08:39.511518 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:39.510995 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 
16 14:08:42.669749 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:42.669721 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:08:46.739191 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:46.739161 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"] Apr 16 14:08:46.739610 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:46.739435 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" containerID="cri-o://5d907047ed54f748dbaab23ccc5181a6d5e7e9a0db81594659fb8c165ea8bb81" gracePeriod=30 Apr 16 14:08:47.668308 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:47.668272 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:08:49.512034 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:49.512005 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" Apr 16 14:08:49.512448 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:49.512062 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" Apr 16 14:08:49.512448 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:49.512092 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:08:52.668493 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:52.668446 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:08:57.669511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:57.669467 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:08:57.669937 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:08:57.669585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:09:02.668890 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:02.668848 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:07.668789 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:07.668740 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:12.511217 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.511186 
Apr 16 14:09:12.514732 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.514711 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.517034 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.517010 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 16 14:09:12.517132 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.517010 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 16 14:09:12.522668 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.522645 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"]
Apr 16 14:09:12.567560 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.567536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.567678 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.567584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.668486 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.668461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.668634 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.668555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.668750 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.668722 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:09:12.669101 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.669078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.671052 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.671032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls\") pod \"model-chainer-54c98cd4d8-cnq9q\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.826184 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.826124 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:12.945995 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:12.945973 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"]
Apr 16 14:09:12.948234 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:09:12.948205 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-7a88316b29ec01b6613084a6f10840653885212f40cd046e946c00ec23787336 WatchSource:0}: Error finding container 7a88316b29ec01b6613084a6f10840653885212f40cd046e946c00ec23787336: Status 404 returned error can't find the container with id 7a88316b29ec01b6613084a6f10840653885212f40cd046e946c00ec23787336
Apr 16 14:09:13.784419 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:13.784368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" event={"ID":"ea6873c4-d5d4-49cb-af4b-3ed34246bad3","Type":"ContainerStarted","Data":"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012"}
Apr 16 14:09:13.784419 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:13.784426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" event={"ID":"ea6873c4-d5d4-49cb-af4b-3ed34246bad3","Type":"ContainerStarted","Data":"7a88316b29ec01b6613084a6f10840653885212f40cd046e946c00ec23787336"}
Apr 16 14:09:13.784798 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:13.784458 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:13.801020 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:13.800974 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podStartSLOduration=1.8009616099999999 podStartE2EDuration="1.80096161s" podCreationTimestamp="2026-04-16 14:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:13.798770172 +0000 UTC m=+582.477277057" watchObservedRunningTime="2026-04-16 14:09:13.80096161 +0000 UTC m=+582.479468498"
Apr 16 14:09:16.796320 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:16.796289 2572 generic.go:358] "Generic (PLEG): container finished" podID="61ddac6d-b172-44b4-85de-3488958a876b" containerID="5d907047ed54f748dbaab23ccc5181a6d5e7e9a0db81594659fb8c165ea8bb81" exitCode=0
Apr 16 14:09:16.796658 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:16.796364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" event={"ID":"61ddac6d-b172-44b4-85de-3488958a876b","Type":"ContainerDied","Data":"5d907047ed54f748dbaab23ccc5181a6d5e7e9a0db81594659fb8c165ea8bb81"}
Apr 16 14:09:16.882114 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:16.882091 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"
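
The pod_startup_latency_tracker entry for model-chainer is worth decoding: with no image pull (both pull timestamps are the zero time), the reported duration is just the watch-observed running time minus podCreationTimestamp, and the quoted values line up: 14:09:13.80096161 minus 14:09:12 is 1.80096161s (the long float 1.8009616099999999 is only float64 formatting of the same value). A quick check of that arithmetic, using only the timestamps quoted in the entry:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the log entry above.
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, _ := time.Parse(layout, "2026-04-16 14:09:12 +0000 UTC")
        observed, _ := time.Parse(layout, "2026-04-16 14:09:13.80096161 +0000 UTC")
        fmt.Println(observed.Sub(created)) // 1.80096161s, matching podStartE2EDuration
    }
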
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" Apr 16 14:09:17.002937 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.002883 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle\") pod \"61ddac6d-b172-44b4-85de-3488958a876b\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " Apr 16 14:09:17.002937 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.002926 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") pod \"61ddac6d-b172-44b4-85de-3488958a876b\" (UID: \"61ddac6d-b172-44b4-85de-3488958a876b\") " Apr 16 14:09:17.003156 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.003135 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "61ddac6d-b172-44b4-85de-3488958a876b" (UID: "61ddac6d-b172-44b4-85de-3488958a876b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:09:17.005060 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.005041 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61ddac6d-b172-44b4-85de-3488958a876b" (UID: "61ddac6d-b172-44b4-85de-3488958a876b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:09:17.104125 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.104099 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ddac6d-b172-44b4-85de-3488958a876b-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:09:17.104125 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.104122 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ddac6d-b172-44b4-85de-3488958a876b-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:09:17.800540 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.800504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4" event={"ID":"61ddac6d-b172-44b4-85de-3488958a876b","Type":"ContainerDied","Data":"99b648f900297dfdc9b7ee220531897cf1ddc3a85a80746384f3c1baa4edd34a"} Apr 16 14:09:17.800945 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.800547 2572 scope.go:117] "RemoveContainer" containerID="5d907047ed54f748dbaab23ccc5181a6d5e7e9a0db81594659fb8c165ea8bb81" Apr 16 14:09:17.800945 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.800521 2572 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 14:09:17.825654 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.825632 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"]
Apr 16 14:09:17.831127 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.831106 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b65bd-66566fb599-49wh4"]
Apr 16 14:09:17.902955 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:17.902930 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ddac6d-b172-44b4-85de-3488958a876b" path="/var/lib/kubelet/pods/61ddac6d-b172-44b4-85de-3488958a876b/volumes"
Apr 16 14:09:19.793850 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:19.793818 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"
Apr 16 14:09:22.609597 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.609564 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"]
Apr 16 14:09:22.610049 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.609824 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" containerID="cri-o://26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012" gracePeriod=30
Apr 16 14:09:22.681746 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.681717 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"]
Apr 16 14:09:22.681992 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.681969 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" containerID="cri-o://2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d" gracePeriod=30
Apr 16 14:09:22.730614 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.730577 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"]
Apr 16 14:09:22.730865 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.730827 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" containerID="cri-o://210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94" gracePeriod=30
Apr 16 14:09:22.784624 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.784591 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"]
Apr 16 14:09:22.784874 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:22.784850 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" containerID="cri-o://4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372" gracePeriod=30
Apr 16 14:09:24.792978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:24.792941 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:09:26.223389 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.223359 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"
Apr 16 14:09:26.278252 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.278228 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location\") pod \"f6a68343-1673-4124-a36a-176be7f5aa7b\" (UID: \"f6a68343-1673-4124-a36a-176be7f5aa7b\") "
Apr 16 14:09:26.278540 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.278520 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6a68343-1673-4124-a36a-176be7f5aa7b" (UID: "f6a68343-1673-4124-a36a-176be7f5aa7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:26.378833 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.378771 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6a68343-1673-4124-a36a-176be7f5aa7b-kserve-provision-location\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:09:26.558162 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.558141 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"
Apr 16 14:09:26.681592 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.681526 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location\") pod \"4b6be662-4ac7-4979-9a73-266d1010c89a\" (UID: \"4b6be662-4ac7-4979-9a73-266d1010c89a\") "
Apr 16 14:09:26.681814 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.681794 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b6be662-4ac7-4979-9a73-266d1010c89a" (UID: "4b6be662-4ac7-4979-9a73-266d1010c89a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:26.782012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.781988 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b6be662-4ac7-4979-9a73-266d1010c89a-kserve-provision-location\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:09:26.830263 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.830230 2572 generic.go:358] "Generic (PLEG): container finished" podID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerID="210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94" exitCode=0
Apr 16 14:09:26.830396 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.830298 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"
Apr 16 14:09:26.830396 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.830350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerDied","Data":"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"}
Apr 16 14:09:26.830396 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.830379 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228" event={"ID":"4b6be662-4ac7-4979-9a73-266d1010c89a","Type":"ContainerDied","Data":"310d27f43ff5bf01a14963c17f6a64724d6671d67e265d499a1c2537165c5bf0"}
Apr 16 14:09:26.830544 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.830398 2572 scope.go:117] "RemoveContainer" containerID="210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"
Apr 16 14:09:26.831843 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.831818 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerID="2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d" exitCode=0
Apr 16 14:09:26.831967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.831896 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"
Apr 16 14:09:26.832080 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.831888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerDied","Data":"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"}
Apr 16 14:09:26.832080 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.832000 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55" event={"ID":"f6a68343-1673-4124-a36a-176be7f5aa7b","Type":"ContainerDied","Data":"f43ee3cb5e6803cdc8f6fed9bb27e287fd47c7770d24dd0c1b9ea84785c069af"}
Apr 16 14:09:26.839456 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.839403 2572 scope.go:117] "RemoveContainer" containerID="3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"
Apr 16 14:09:26.846806 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.846785 2572 scope.go:117] "RemoveContainer" containerID="210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"
Apr 16 14:09:26.847048 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:26.847032 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94\": container with ID starting with 210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94 not found: ID does not exist" containerID="210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"
Apr 16 14:09:26.847090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.847056 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94"} err="failed to get container status \"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94\": rpc error: code = NotFound desc = could not find container \"210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94\": container with ID starting with 210c15f6ba66ea0a4c70ee0c9c53f9ddd8e6362bb768fe264a043a29414c3d94 not found: ID does not exist"
Apr 16 14:09:26.847090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.847072 2572 scope.go:117] "RemoveContainer" containerID="3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"
Apr 16 14:09:26.847269 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:26.847251 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a\": container with ID starting with 3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a not found: ID does not exist" containerID="3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"
Apr 16 14:09:26.847315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.847276 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a"} err="failed to get container status \"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a\": rpc error: code = NotFound desc = could not find container \"3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a\": container with ID starting with 3b399012b1e03f61283eaa23a1b24bcfd56719d406cf704482c85a3457ece37a not found: ID does not exist"
Apr 16 14:09:26.847315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.847293 2572 scope.go:117] "RemoveContainer" containerID="2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"
Apr 16 14:09:26.851041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.851022 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"]
Apr 16 14:09:26.855016 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.854993 2572 scope.go:117] "RemoveContainer" containerID="c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"
Apr 16 14:09:26.856676 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.856658 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-bcbc7cdf4-rp228"]
Apr 16 14:09:26.861789 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.861775 2572 scope.go:117] "RemoveContainer" containerID="2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"
Apr 16 14:09:26.862025 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:26.862007 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d\": container with ID starting with 2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d not found: ID does not exist" containerID="2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"
Apr 16 14:09:26.862098 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.862036 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d"} err="failed to get container status \"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d\": rpc error: code = NotFound desc = could not find container \"2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d\": container with ID starting with 2ff92cbae3a9ff6b40ede13a706c42ba712b57532b77c62a56b0066c5a32e86d not found: ID does not exist"
Apr 16 14:09:26.862098 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.862058 2572 scope.go:117] "RemoveContainer" containerID="c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"
Apr 16 14:09:26.862265 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:26.862249 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406\": container with ID starting with c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406 not found: ID does not exist" containerID="c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"
Apr 16 14:09:26.862315 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.862271 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406"} err="failed to get container status \"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406\": rpc error: code = NotFound desc = could not find container \"c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406\": container with ID starting with c7489a602ab4d1d738342115a718b638b4d63b7d4de7897adf7c993e691ad406 not found: ID does not exist"
Apr 16 14:09:26.866122 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.866098 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"]
Apr 16 14:09:26.871251 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:26.871230 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4xk55"]
Apr 16 14:09:27.230281 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.230260 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"
Apr 16 14:09:27.286550 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.286489 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location\") pod \"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b\" (UID: \"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b\") "
Apr 16 14:09:27.286806 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.286780 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" (UID: "fa431d5e-97ef-4a0b-b2f7-9a17ee01716b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
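
Every "RemoveContainer" above that ends in a NotFound pair is a benign race: by the time the kubelet asks the runtime for the container's status, CRI-O has already deleted it, the RPC comes back with gRPC code NotFound, and the kubelet simply logs "DeleteContainer returned error" and moves on, since a missing container is the desired end state of a delete anyway. A sketch of that idempotent-delete pattern against a gRPC-style API; the remove callback is a stand-in for a CRI RemoveContainer call, not the kubelet's actual code path:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent treats gRPC NotFound as success: the container is
    // already gone, which is exactly what a delete wants.
    func removeIfPresent(id string, remove func(string) error) error {
        if err := remove(id); err != nil {
            if status.Code(err) == codes.NotFound {
                fmt.Printf("container %s already gone, nothing to do\n", id)
                return nil
            }
            return err
        }
        return nil
    }

    func main() {
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        _ = removeIfPresent("210c15f6", gone) // hypothetical truncated ID
    }
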
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:09:27.387634 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.387610 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b-kserve-provision-location\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:09:27.838188 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.838156 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerID="4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372" exitCode=0 Apr 16 14:09:27.838392 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.838196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerDied","Data":"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372"} Apr 16 14:09:27.838392 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.838221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" event={"ID":"fa431d5e-97ef-4a0b-b2f7-9a17ee01716b","Type":"ContainerDied","Data":"59870484d53a29fbfc080a8df4894d09e58ef9a93722ba0ef3d7e451bad1b1de"} Apr 16 14:09:27.838392 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.838236 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k" Apr 16 14:09:27.838392 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.838243 2572 scope.go:117] "RemoveContainer" containerID="4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372" Apr 16 14:09:27.846802 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.846786 2572 scope.go:117] "RemoveContainer" containerID="9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b" Apr 16 14:09:27.853983 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.853967 2572 scope.go:117] "RemoveContainer" containerID="4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372" Apr 16 14:09:27.854214 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:27.854197 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372\": container with ID starting with 4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372 not found: ID does not exist" containerID="4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372" Apr 16 14:09:27.854265 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.854221 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372"} err="failed to get container status \"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372\": rpc error: code = NotFound desc = could not find container \"4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372\": container with ID starting with 4449b34cfcdfc3c11b59910dc407fbaf4446b57dd1d020398fdd31c08e66f372 not found: ID does not exist" Apr 16 14:09:27.854265 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.854238 2572 scope.go:117] "RemoveContainer" containerID="9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b" Apr 16 14:09:27.854469 ip-10-0-128-129 
kubenswrapper[2572]: E0416 14:09:27.854453 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b\": container with ID starting with 9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b not found: ID does not exist" containerID="9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b" Apr 16 14:09:27.854520 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.854476 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b"} err="failed to get container status \"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b\": rpc error: code = NotFound desc = could not find container \"9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b\": container with ID starting with 9d91fabb19278e4718160ce3530e6980da473a786e99908ca69c6b53aea1cb7b not found: ID does not exist" Apr 16 14:09:27.857922 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.857775 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"] Apr 16 14:09:27.859679 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.859660 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6fd8964888-gkf4k"] Apr 16 14:09:27.902889 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.902868 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" path="/var/lib/kubelet/pods/4b6be662-4ac7-4979-9a73-266d1010c89a/volumes" Apr 16 14:09:27.903212 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.903200 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" path="/var/lib/kubelet/pods/f6a68343-1673-4124-a36a-176be7f5aa7b/volumes" Apr 16 14:09:27.903556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:27.903542 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" path="/var/lib/kubelet/pods/fa431d5e-97ef-4a0b-b2f7-9a17ee01716b/volumes" Apr 16 14:09:29.792495 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:29.792460 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:34.792855 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:34.792811 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:34.793171 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:34.792916 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" Apr 16 14:09:39.792181 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:39.792142 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:44.792675 
ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:44.792626 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:49.792976 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:49.792885 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:52.663144 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:52.663094 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-conmon-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:09:52.663698 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:52.663161 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-conmon-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:09:52.663802 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:52.663178 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-conmon-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:09:52.663802 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:52.663284 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-conmon-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6873c4_d5d4_49cb_af4b_3ed34246bad3.slice/crio-26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:09:52.792241 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.792218 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" Apr 16 14:09:52.922531 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.922462 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerID="26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012" exitCode=0 Apr 16 14:09:52.922531 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.922518 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" Apr 16 14:09:52.922736 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.922547 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" event={"ID":"ea6873c4-d5d4-49cb-af4b-3ed34246bad3","Type":"ContainerDied","Data":"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012"} Apr 16 14:09:52.922736 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.922601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q" event={"ID":"ea6873c4-d5d4-49cb-af4b-3ed34246bad3","Type":"ContainerDied","Data":"7a88316b29ec01b6613084a6f10840653885212f40cd046e946c00ec23787336"} Apr 16 14:09:52.922736 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.922623 2572 scope.go:117] "RemoveContainer" containerID="26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012" Apr 16 14:09:52.930013 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.929996 2572 scope.go:117] "RemoveContainer" containerID="26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012" Apr 16 14:09:52.930260 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:52.930243 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012\": container with ID starting with 26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012 not found: ID does not exist" containerID="26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012" Apr 16 14:09:52.930302 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.930268 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012"} err="failed to get container status \"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012\": rpc error: code = NotFound desc = could not find container \"26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012\": container with ID starting with 26c27b34ca3f1cc7e3de8ae4214668cca802141190060300dd97a673c2073012 not found: ID does not exist" Apr 16 14:09:52.976635 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.976612 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls\") pod \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " Apr 16 14:09:52.976719 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.976682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle\") pod \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\" (UID: \"ea6873c4-d5d4-49cb-af4b-3ed34246bad3\") " Apr 16 14:09:52.977012 ip-10-0-128-129 
kubenswrapper[2572]: I0416 14:09:52.976990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ea6873c4-d5d4-49cb-af4b-3ed34246bad3" (UID: "ea6873c4-d5d4-49cb-af4b-3ed34246bad3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:09:52.978793 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:52.978774 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea6873c4-d5d4-49cb-af4b-3ed34246bad3" (UID: "ea6873c4-d5d4-49cb-af4b-3ed34246bad3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:09:53.077387 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:53.077362 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:09:53.077387 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:53.077387 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6873c4-d5d4-49cb-af4b-3ed34246bad3-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:09:53.241053 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:53.241028 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"] Apr 16 14:09:53.243290 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:53.243270 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-54c98cd4d8-cnq9q"] Apr 16 14:09:53.903822 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:53.903785 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" path="/var/lib/kubelet/pods/ea6873c4-d5d4-49cb-af4b-3ed34246bad3/volumes" Apr 16 14:09:56.967628 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967597 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"] Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967904 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967915 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967925 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967930 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967939 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" Apr 16 
14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967944 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ddac6d-b172-44b4-85de-3488958a876b" containerName="switch-graph-b65bd" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967952 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967957 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="storage-initializer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967963 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967968 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967974 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967979 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967986 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.967992 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968005 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" Apr 16 14:09:56.968012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968011 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" Apr 16 14:09:56.968556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968055 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa431d5e-97ef-4a0b-b2f7-9a17ee01716b" containerName="kserve-container" Apr 16 14:09:56.968556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968065 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6a68343-1673-4124-a36a-176be7f5aa7b" containerName="kserve-container" Apr 16 14:09:56.968556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968071 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b6be662-4ac7-4979-9a73-266d1010c89a" containerName="kserve-container" Apr 16 14:09:56.968556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968076 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea6873c4-d5d4-49cb-af4b-3ed34246bad3" containerName="model-chainer" Apr 16 14:09:56.968556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.968085 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="61ddac6d-b172-44b4-85de-3488958a876b" 
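
The RemoveStaleState burst above fires when the new switch-graph-7cd29 pod is admitted: the CPU and memory managers walk their checkpointed assignments and drop entries for the containers of the five pods deleted earlier. The shape of that sweep is a map keyed by pod UID and container name, pruned against the set of containers the kubelet still knows about; schematically (illustrative types and values, not the kubelet's):

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops assignments for containers that no longer
    // exist, mirroring the cpu_manager/memory_manager sweep in the log.
    // "assignments" stands in for the managers' checkpointed state;
    // "active" for the containers the kubelet still tracks.
    func removeStaleState(assignments map[key]string, active map[key]bool) {
        for k := range assignments {
            if !active[k] {
                fmt.Printf("RemoveStaleState: %s/%s\n", k.podUID, k.container)
                delete(assignments, k) // "Deleted CPUSet assignment"
            }
        }
    }

    func main() {
        state := map[key]string{
            {"ea6873c4", "model-chainer"}:      "cpus 0-1", // stale
            {"2e90e6a5", "switch-graph-7cd29"}: "cpus 2-3", // still active
        }
        removeStaleState(state, map[key]bool{{"2e90e6a5", "switch-graph-7cd29"}: true})
    }
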
containerName="switch-graph-b65bd" Apr 16 14:09:56.972908 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.972893 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:56.975505 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.975479 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-7cd29-kube-rbac-proxy-sar-config\"" Apr 16 14:09:56.975648 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.975528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-7cd29-serving-cert\"" Apr 16 14:09:56.975648 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.975528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\"" Apr 16 14:09:56.976493 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.976468 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:09:56.980353 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:56.980268 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"] Apr 16 14:09:57.105728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.105698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.105857 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.105734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.206804 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.206775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.206930 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.206811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.206930 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:57.206927 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-7cd29-serving-cert: secret "switch-graph-7cd29-serving-cert" not found Apr 16 14:09:57.207030 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:09:57.207001 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls 
podName:2e90e6a5-9395-4b19-baca-1fd5c8add502 nodeName:}" failed. No retries permitted until 2026-04-16 14:09:57.706985307 +0000 UTC m=+626.385492177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls") pod "switch-graph-7cd29-bcdbb89fd-mxb6d" (UID: "2e90e6a5-9395-4b19-baca-1fd5c8add502") : secret "switch-graph-7cd29-serving-cert" not found Apr 16 14:09:57.207397 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.207378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.710974 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.710943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.713472 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.713450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") pod \"switch-graph-7cd29-bcdbb89fd-mxb6d\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") " pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:57.887978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:57.887952 2572 util.go:30] "No sandbox for pod can be found. 
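
The nestedpendingoperations entry above shows the volume manager's retry discipline: the first MountVolume.SetUp for proxy-tls at 14:09:57.207 fails because the switch-graph-7cd29-serving-cert secret does not exist yet, so the operation is locked out until exactly failure time plus 500ms (the durationBeforeRetry), and the retry at 14:09:57.710 succeeds because the secret has appeared in the meantime. Repeated failures back off progressively; a sketch of that kind of schedule, where the 500ms initial wait comes from the log but the growth factor and cap are illustrative assumptions, not the kubelet's exact constants:

    package main

    import (
        "fmt"
        "time"
    )

    // backoff returns the wait before retry attempt n, starting at 500ms
    // (the durationBeforeRetry seen in the log) and growing geometrically.
    // The factor of 2 and the 2-minute cap are illustrative assumptions.
    func backoff(n int) time.Duration {
        d := 500 * time.Millisecond
        for i := 0; i < n; i++ {
            d *= 2
            if d > 2*time.Minute {
                return 2 * time.Minute
            }
        }
        return d
    }

    func main() {
        for n := 0; n < 5; n++ {
            fmt.Printf("attempt %d: wait %v\n", n+1, backoff(n))
        }
    }
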
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:58.003797 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:58.003771 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"] Apr 16 14:09:58.006998 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:09:58.006970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e90e6a5_9395_4b19_baca_1fd5c8add502.slice/crio-b2b31d57b15591c9d16c8f9c1eec7ee4d4ad0f617b6024362018fedc951de663 WatchSource:0}: Error finding container b2b31d57b15591c9d16c8f9c1eec7ee4d4ad0f617b6024362018fedc951de663: Status 404 returned error can't find the container with id b2b31d57b15591c9d16c8f9c1eec7ee4d4ad0f617b6024362018fedc951de663 Apr 16 14:09:58.945037 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:58.944998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" event={"ID":"2e90e6a5-9395-4b19-baca-1fd5c8add502","Type":"ContainerStarted","Data":"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"} Apr 16 14:09:58.945037 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:58.945040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" event={"ID":"2e90e6a5-9395-4b19-baca-1fd5c8add502","Type":"ContainerStarted","Data":"b2b31d57b15591c9d16c8f9c1eec7ee4d4ad0f617b6024362018fedc951de663"} Apr 16 14:09:58.945289 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:58.945111 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:09:58.960557 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:09:58.960514 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podStartSLOduration=2.960502998 podStartE2EDuration="2.960502998s" podCreationTimestamp="2026-04-16 14:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:58.958898508 +0000 UTC m=+627.637405396" watchObservedRunningTime="2026-04-16 14:09:58.960502998 +0000 UTC m=+627.639009886" Apr 16 14:10:04.953964 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:04.953930 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" Apr 16 14:10:32.798654 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.798613 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"] Apr 16 14:10:32.803776 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.803748 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:10:32.806058 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.806033 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3604c-kube-rbac-proxy-sar-config\"" Apr 16 14:10:32.806261 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.806235 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"] Apr 16 14:10:32.806411 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.806058 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3604c-serving-cert\"" Apr 16 14:10:32.861448 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.861421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:10:32.861548 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.861455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:10:32.961825 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.961800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:10:32.961917 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.961835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:10:32.961971 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:10:32.961945 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-3604c-serving-cert: secret "sequence-graph-3604c-serving-cert" not found Apr 16 14:10:32.962035 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:10:32.962024 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls podName:c4ed9b14-a770-4f0d-a350-e3050af1b638 nodeName:}" failed. No retries permitted until 2026-04-16 14:10:33.462003062 +0000 UTC m=+662.140509934 (durationBeforeRetry 500ms). 
Apr 16 14:10:32.962511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:32.962494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:10:33.465427 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:33.465398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:10:33.467800 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:33.467773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") pod \"sequence-graph-3604c-5c94498546-6xnhc\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:10:33.715583 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:33.715505 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:10:33.833984 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:33.833840 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"]
Apr 16 14:10:33.836067 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:10:33.836044 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ed9b14_a770_4f0d_a350_e3050af1b638.slice/crio-26e3b67048731050ddd7ef19b4f8e03047f1045e3c57ab852700b51057ee85cb WatchSource:0}: Error finding container 26e3b67048731050ddd7ef19b4f8e03047f1045e3c57ab852700b51057ee85cb: Status 404 returned error can't find the container with id 26e3b67048731050ddd7ef19b4f8e03047f1045e3c57ab852700b51057ee85cb
Apr 16 14:10:33.837949 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:33.837929 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:10:34.049083 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:34.049055 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" event={"ID":"c4ed9b14-a770-4f0d-a350-e3050af1b638","Type":"ContainerStarted","Data":"4c16212392464471666e3f23172e2645cac7c4431593eca502869ae1fb144404"}
Apr 16 14:10:34.049201 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:34.049090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" event={"ID":"c4ed9b14-a770-4f0d-a350-e3050af1b638","Type":"ContainerStarted","Data":"26e3b67048731050ddd7ef19b4f8e03047f1045e3c57ab852700b51057ee85cb"}
Apr 16 14:10:34.049201 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:34.049115 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:10:34.064876 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:34.064837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podStartSLOduration=2.064822856 podStartE2EDuration="2.064822856s" podCreationTimestamp="2026-04-16 14:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:10:34.06263365 +0000 UTC m=+662.741140537" watchObservedRunningTime="2026-04-16 14:10:34.064822856 +0000 UTC m=+662.743329744"
Apr 16 14:10:40.058491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:10:40.058464 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"
Apr 16 14:18:11.768063 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:11.768033 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"]
Apr 16 14:18:11.768558 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:11.768253 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" containerID="cri-o://ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae" gracePeriod=30
Apr 16 14:18:14.952152 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:14.952096 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:19.951551 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:19.951515 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:24.952015 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:24.951966 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:24.952404 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:24.952105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"
Apr 16 14:18:29.952499 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:29.952456 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:34.952540 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:34.952493 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:39.952478 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:39.952441 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:18:41.925152 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:41.925127 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"
Apr 16 14:18:41.977372 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:41.977330 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") pod \"2e90e6a5-9395-4b19-baca-1fd5c8add502\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") "
Apr 16 14:18:41.977497 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:41.977415 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle\") pod \"2e90e6a5-9395-4b19-baca-1fd5c8add502\" (UID: \"2e90e6a5-9395-4b19-baca-1fd5c8add502\") "
Apr 16 14:18:41.977746 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:41.977722 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2e90e6a5-9395-4b19-baca-1fd5c8add502" (UID: "2e90e6a5-9395-4b19-baca-1fd5c8add502"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:18:41.979443 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:41.979418 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e90e6a5-9395-4b19-baca-1fd5c8add502" (UID: "2e90e6a5-9395-4b19-baca-1fd5c8add502"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:18:42.078779 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.078733 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e90e6a5-9395-4b19-baca-1fd5c8add502-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:18:42.078779 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.078751 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e90e6a5-9395-4b19-baca-1fd5c8add502-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:18:42.512882 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.512851 2572 generic.go:358] "Generic (PLEG): container finished" podID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerID="ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae" exitCode=0
Apr 16 14:18:42.513041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.512888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" event={"ID":"2e90e6a5-9395-4b19-baca-1fd5c8add502","Type":"ContainerDied","Data":"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"}
Apr 16 14:18:42.513041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.512905 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"
Apr 16 14:18:42.513041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.512919 2572 scope.go:117] "RemoveContainer" containerID="ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"
Apr 16 14:18:42.513041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.512910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d" event={"ID":"2e90e6a5-9395-4b19-baca-1fd5c8add502","Type":"ContainerDied","Data":"b2b31d57b15591c9d16c8f9c1eec7ee4d4ad0f617b6024362018fedc951de663"}
Apr 16 14:18:42.521662 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.521642 2572 scope.go:117] "RemoveContainer" containerID="ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"
Apr 16 14:18:42.521900 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:18:42.521882 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae\": container with ID starting with ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae not found: ID does not exist" containerID="ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"
Apr 16 14:18:42.521954 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.521914 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae"} err="failed to get container status \"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae\": rpc error: code = NotFound desc = could not find container \"ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae\": container with ID starting with ff2422c308169f2ecfc5ac4d91b3544424199adf26390d03e3038b868e6ffaae not found: ID does not exist"
Apr 16 14:18:42.533442 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.533415 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"]
pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"] Apr 16 14:18:42.538738 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:42.538713 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-7cd29-bcdbb89fd-mxb6d"] Apr 16 14:18:43.902271 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:43.902236 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" path="/var/lib/kubelet/pods/2e90e6a5-9395-4b19-baca-1fd5c8add502/volumes" Apr 16 14:18:47.546711 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:47.546625 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"] Apr 16 14:18:47.547141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:47.546850 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" containerID="cri-o://4c16212392464471666e3f23172e2645cac7c4431593eca502869ae1fb144404" gracePeriod=30 Apr 16 14:18:50.056406 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:50.056361 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:18:55.057939 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:18:55.057892 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:00.056550 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:00.056511 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:00.056918 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:00.056607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:19:05.056295 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:05.056259 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:10.057032 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:10.056992 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:15.056350 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:15.056297 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:17.623912 ip-10-0-128-129 kubenswrapper[2572]: I0416 
14:19:17.623883 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerID="4c16212392464471666e3f23172e2645cac7c4431593eca502869ae1fb144404" exitCode=0 Apr 16 14:19:17.624203 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.623965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" event={"ID":"c4ed9b14-a770-4f0d-a350-e3050af1b638","Type":"ContainerDied","Data":"4c16212392464471666e3f23172e2645cac7c4431593eca502869ae1fb144404"} Apr 16 14:19:17.682898 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.682879 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:19:17.832932 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.832877 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") pod \"c4ed9b14-a770-4f0d-a350-e3050af1b638\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " Apr 16 14:19:17.833034 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.832961 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle\") pod \"c4ed9b14-a770-4f0d-a350-e3050af1b638\" (UID: \"c4ed9b14-a770-4f0d-a350-e3050af1b638\") " Apr 16 14:19:17.833295 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.833275 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c4ed9b14-a770-4f0d-a350-e3050af1b638" (UID: "c4ed9b14-a770-4f0d-a350-e3050af1b638"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:19:17.834932 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.834911 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c4ed9b14-a770-4f0d-a350-e3050af1b638" (UID: "c4ed9b14-a770-4f0d-a350-e3050af1b638"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:19:17.933919 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.933897 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ed9b14-a770-4f0d-a350-e3050af1b638-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:19:17.933919 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:17.933919 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4ed9b14-a770-4f0d-a350-e3050af1b638-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:19:18.628230 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:18.628204 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" Apr 16 14:19:18.628230 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:18.628209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc" event={"ID":"c4ed9b14-a770-4f0d-a350-e3050af1b638","Type":"ContainerDied","Data":"26e3b67048731050ddd7ef19b4f8e03047f1045e3c57ab852700b51057ee85cb"} Apr 16 14:19:18.628693 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:18.628255 2572 scope.go:117] "RemoveContainer" containerID="4c16212392464471666e3f23172e2645cac7c4431593eca502869ae1fb144404" Apr 16 14:19:18.643108 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:18.643083 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"] Apr 16 14:19:18.644845 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:18.644825 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3604c-5c94498546-6xnhc"] Apr 16 14:19:19.902832 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:19.902789 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" path="/var/lib/kubelet/pods/c4ed9b14-a770-4f0d-a350-e3050af1b638/volumes" Apr 16 14:19:22.003399 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003368 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:19:22.003768 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003691 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" Apr 16 14:19:22.003768 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003702 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" Apr 16 14:19:22.003768 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003727 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" Apr 16 14:19:22.003768 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003733 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" Apr 16 14:19:22.003895 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003779 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e90e6a5-9395-4b19-baca-1fd5c8add502" containerName="switch-graph-7cd29" Apr 16 14:19:22.003895 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.003790 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4ed9b14-a770-4f0d-a350-e3050af1b638" containerName="sequence-graph-3604c" Apr 16 14:19:22.008193 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.008175 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.010599 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.010578 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-305b2-serving-cert\"" Apr 16 14:19:22.010738 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.010576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-305b2-kube-rbac-proxy-sar-config\"" Apr 16 14:19:22.010738 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.010678 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\"" Apr 16 14:19:22.011681 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.011647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:19:22.013724 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.013705 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:19:22.164144 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.164115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.164300 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.164171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.264608 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.264546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.264700 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.264623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.265694 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.265666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.271617 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.271596 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls\") pod \"ensemble-graph-305b2-7c6bf4cb48-2gft4\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.320375 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.320354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.434355 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.434310 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:19:22.436759 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:19:22.436731 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2fc1d4_835e_497d_9beb_e2d613492a73.slice/crio-95f61768fd976320fc791cbcf385eb6c19394bd847e319f90eea909563b2185e WatchSource:0}: Error finding container 95f61768fd976320fc791cbcf385eb6c19394bd847e319f90eea909563b2185e: Status 404 returned error can't find the container with id 95f61768fd976320fc791cbcf385eb6c19394bd847e319f90eea909563b2185e Apr 16 14:19:22.438468 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.438452 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:19:22.642913 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.642838 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" event={"ID":"cf2fc1d4-835e-497d-9beb-e2d613492a73","Type":"ContainerStarted","Data":"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c"} Apr 16 14:19:22.642913 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.642878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" event={"ID":"cf2fc1d4-835e-497d-9beb-e2d613492a73","Type":"ContainerStarted","Data":"95f61768fd976320fc791cbcf385eb6c19394bd847e319f90eea909563b2185e"} Apr 16 14:19:22.643059 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.642969 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:22.658746 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:22.658694 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podStartSLOduration=1.658680877 podStartE2EDuration="1.658680877s" podCreationTimestamp="2026-04-16 14:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:22.657789851 +0000 UTC m=+1191.336296735" watchObservedRunningTime="2026-04-16 14:19:22.658680877 +0000 UTC m=+1191.337187764" Apr 16 14:19:28.651765 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:28.651729 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:32.063963 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:32.063935 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:19:32.064426 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:32.064197 2572 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" containerID="cri-o://3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c" gracePeriod=30 Apr 16 14:19:33.649613 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:33.649571 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:38.650511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:38.650468 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:43.649843 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:43.649807 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:43.650200 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:43.649897 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:19:48.650626 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:48.650585 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:53.650128 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:53.650094 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:57.722956 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.722923 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:19:57.726047 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.726031 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.728441 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.728419 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a84b0-serving-cert\"" Apr 16 14:19:57.728441 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.728426 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a84b0-kube-rbac-proxy-sar-config\"" Apr 16 14:19:57.731801 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.731765 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:19:57.813952 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.813928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.814055 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.813975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.914795 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.914772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.914912 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.914820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.915389 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.915366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:57.917194 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:57.917176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls\") pod \"sequence-graph-a84b0-6cc45b7f74-r64lg\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:58.036935 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.036906 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:58.155554 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.155524 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:19:58.158810 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:19:58.158780 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0c4e7a_bff7_404c_aa9a_7473711fe5f2.slice/crio-6cc1dad7d4d61fe3cdfddcd6fad4cdf3403b9fcfe739946d90a7684cb86628cf WatchSource:0}: Error finding container 6cc1dad7d4d61fe3cdfddcd6fad4cdf3403b9fcfe739946d90a7684cb86628cf: Status 404 returned error can't find the container with id 6cc1dad7d4d61fe3cdfddcd6fad4cdf3403b9fcfe739946d90a7684cb86628cf Apr 16 14:19:58.649884 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.649848 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:19:58.751869 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.751844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" event={"ID":"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2","Type":"ContainerStarted","Data":"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c"} Apr 16 14:19:58.751869 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.751877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" event={"ID":"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2","Type":"ContainerStarted","Data":"6cc1dad7d4d61fe3cdfddcd6fad4cdf3403b9fcfe739946d90a7684cb86628cf"} Apr 16 14:19:58.752316 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.751966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:19:58.768371 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:19:58.768312 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podStartSLOduration=1.7682995670000001 podStartE2EDuration="1.768299567s" podCreationTimestamp="2026-04-16 14:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:58.767115877 +0000 UTC m=+1227.445622764" watchObservedRunningTime="2026-04-16 14:19:58.768299567 +0000 UTC m=+1227.446806455" Apr 16 14:20:02.204917 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.204891 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:20:02.247869 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.247842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle\") pod \"cf2fc1d4-835e-497d-9beb-e2d613492a73\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " Apr 16 14:20:02.248004 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.247899 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls\") pod \"cf2fc1d4-835e-497d-9beb-e2d613492a73\" (UID: \"cf2fc1d4-835e-497d-9beb-e2d613492a73\") " Apr 16 14:20:02.248197 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.248175 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cf2fc1d4-835e-497d-9beb-e2d613492a73" (UID: "cf2fc1d4-835e-497d-9beb-e2d613492a73"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:20:02.249988 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.249968 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cf2fc1d4-835e-497d-9beb-e2d613492a73" (UID: "cf2fc1d4-835e-497d-9beb-e2d613492a73"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:20:02.348567 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.348513 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf2fc1d4-835e-497d-9beb-e2d613492a73-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:20:02.348567 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.348535 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf2fc1d4-835e-497d-9beb-e2d613492a73-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:20:02.765815 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.765782 2572 generic.go:358] "Generic (PLEG): container finished" podID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerID="3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c" exitCode=0 Apr 16 14:20:02.765967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.765844 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" Apr 16 14:20:02.765967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.765871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" event={"ID":"cf2fc1d4-835e-497d-9beb-e2d613492a73","Type":"ContainerDied","Data":"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c"} Apr 16 14:20:02.765967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.765910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4" event={"ID":"cf2fc1d4-835e-497d-9beb-e2d613492a73","Type":"ContainerDied","Data":"95f61768fd976320fc791cbcf385eb6c19394bd847e319f90eea909563b2185e"} Apr 16 14:20:02.765967 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.765927 2572 scope.go:117] "RemoveContainer" containerID="3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c" Apr 16 14:20:02.776599 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.776288 2572 scope.go:117] "RemoveContainer" containerID="3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c" Apr 16 14:20:02.776943 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:20:02.776918 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c\": container with ID starting with 3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c not found: ID does not exist" containerID="3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c" Apr 16 14:20:02.777018 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.776953 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c"} err="failed to get container status \"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c\": rpc error: code = NotFound desc = could not find container \"3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c\": container with ID starting with 3a9165c40abb81bb4598ff917935fa9c2ae62dff4b083c3e6c6da382710fe99c not found: ID does not exist" Apr 16 14:20:02.786242 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.786216 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:20:02.791697 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:02.791678 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-305b2-7c6bf4cb48-2gft4"] Apr 16 14:20:03.902118 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:03.902087 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" path="/var/lib/kubelet/pods/cf2fc1d4-835e-497d-9beb-e2d613492a73/volumes" Apr 16 14:20:04.760044 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:04.760008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:20:07.834177 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:07.834144 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:20:07.834556 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:07.834388 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" containerID="cri-o://b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c" gracePeriod=30 Apr 16 14:20:09.759136 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:09.759092 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:14.759568 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:14.759518 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:19.759656 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:19.759575 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:19.760059 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:19.759682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:20:24.759240 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:24.759184 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:29.758941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:29.758895 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:34.759203 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:34.759167 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:20:38.021964 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.021942 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:20:38.105900 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.105874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle\") pod \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " Apr 16 14:20:38.106035 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.105906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls\") pod \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\" (UID: \"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2\") " Apr 16 14:20:38.106244 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.106218 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" (UID: "dd0c4e7a-bff7-404c-aa9a-7473711fe5f2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:20:38.108008 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.107986 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" (UID: "dd0c4e7a-bff7-404c-aa9a-7473711fe5f2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:20:38.206641 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.206576 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.206641 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.206609 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.876871 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.876830 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerID="b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c" exitCode=0 Apr 16 14:20:38.877029 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.876894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" event={"ID":"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2","Type":"ContainerDied","Data":"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c"} Apr 16 14:20:38.877029 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.876928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" event={"ID":"dd0c4e7a-bff7-404c-aa9a-7473711fe5f2","Type":"ContainerDied","Data":"6cc1dad7d4d61fe3cdfddcd6fad4cdf3403b9fcfe739946d90a7684cb86628cf"} Apr 16 14:20:38.877029 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.876945 2572 scope.go:117] "RemoveContainer" 
containerID="b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c" Apr 16 14:20:38.877029 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.876899 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg" Apr 16 14:20:38.886316 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.886296 2572 scope.go:117] "RemoveContainer" containerID="b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c" Apr 16 14:20:38.886663 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:20:38.886642 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c\": container with ID starting with b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c not found: ID does not exist" containerID="b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c" Apr 16 14:20:38.886737 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.886672 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c"} err="failed to get container status \"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c\": rpc error: code = NotFound desc = could not find container \"b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c\": container with ID starting with b980b259ddeb81751c0d4c6ce3e249cd91e369c0ffc52cdd32d7737889cf3c3c not found: ID does not exist" Apr 16 14:20:38.897530 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.897503 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:20:38.901156 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:38.901132 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a84b0-6cc45b7f74-r64lg"] Apr 16 14:20:39.902684 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:39.902651 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" path="/var/lib/kubelet/pods/dd0c4e7a-bff7-404c-aa9a-7473711fe5f2/volumes" Apr 16 14:20:42.281956 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.281925 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:20:42.282434 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282416 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" Apr 16 14:20:42.282523 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282436 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" Apr 16 14:20:42.282523 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282452 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" Apr 16 14:20:42.282523 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282461 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" Apr 16 14:20:42.282672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282538 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="cf2fc1d4-835e-497d-9beb-e2d613492a73" containerName="ensemble-graph-305b2" Apr 16 14:20:42.282672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.282559 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd0c4e7a-bff7-404c-aa9a-7473711fe5f2" containerName="sequence-graph-a84b0" Apr 16 14:20:42.286728 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.286708 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.288983 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.288963 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-911ed-kube-rbac-proxy-sar-config\"" Apr 16 14:20:42.289083 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.289011 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-911ed-serving-cert\"" Apr 16 14:20:42.289083 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.289023 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:20:42.289185 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.289082 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\"" Apr 16 14:20:42.294040 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.294015 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:20:42.335698 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.335677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.335791 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.335715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.436978 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.436945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.437113 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.436993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.437700 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.437680 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.439518 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.439498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls\") pod \"ensemble-graph-911ed-78995f655-sjdp4\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.604548 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.604475 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.723665 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.723641 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:20:42.726111 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:20:42.726082 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb369d698_7d8c_4228_ba99_ef30375cff2a.slice/crio-512ba4624f63d501c427cc13c144525c0575403cc64161989022970b5eefb9eb WatchSource:0}: Error finding container 512ba4624f63d501c427cc13c144525c0575403cc64161989022970b5eefb9eb: Status 404 returned error can't find the container with id 512ba4624f63d501c427cc13c144525c0575403cc64161989022970b5eefb9eb Apr 16 14:20:42.891899 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.891826 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" event={"ID":"b369d698-7d8c-4228-ba99-ef30375cff2a","Type":"ContainerStarted","Data":"224c28ebefb797027e0e1ef3283bde6bf26bcb531416b81532829ff63e78b3f8"} Apr 16 14:20:42.891899 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.891864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" event={"ID":"b369d698-7d8c-4228-ba99-ef30375cff2a","Type":"ContainerStarted","Data":"512ba4624f63d501c427cc13c144525c0575403cc64161989022970b5eefb9eb"} Apr 16 14:20:42.892046 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.891937 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:20:42.907518 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:42.907481 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podStartSLOduration=0.907465118 podStartE2EDuration="907.465118ms" podCreationTimestamp="2026-04-16 14:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:42.906758265 +0000 UTC m=+1271.585265154" watchObservedRunningTime="2026-04-16 14:20:42.907465118 +0000 UTC m=+1271.585972000" Apr 16 14:20:48.901601 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:20:48.901566 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:21:18.054528 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.054498 2572 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:21:18.058089 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.058068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.060478 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.060456 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-009d9-serving-cert\"" Apr 16 14:21:18.060478 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.060475 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-009d9-kube-rbac-proxy-sar-config\"" Apr 16 14:21:18.067420 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.067397 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:21:18.194310 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.194280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.194433 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.194320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.295139 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.295114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.295227 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.295151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.295276 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:21:18.295247 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-009d9-serving-cert: secret "sequence-graph-009d9-serving-cert" not found Apr 16 14:21:18.295355 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:21:18.295321 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls podName:849364b2-f5e1-424e-b716-447daadb40de nodeName:}" failed. No retries permitted until 2026-04-16 14:21:18.795305221 +0000 UTC m=+1307.473812088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls") pod "sequence-graph-009d9-764c944467-bghlk" (UID: "849364b2-f5e1-424e-b716-447daadb40de") : secret "sequence-graph-009d9-serving-cert" not found Apr 16 14:21:18.295855 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.295830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.798736 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.798707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.801088 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.801066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") pod \"sequence-graph-009d9-764c944467-bghlk\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:18.969108 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:18.969085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:19.089839 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:19.089817 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:21:19.092229 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:21:19.092202 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849364b2_f5e1_424e_b716_447daadb40de.slice/crio-a1a1e25a4759e4f33cccf88e99aba9f2f36d7edb9c5462b2611303141d9a5686 WatchSource:0}: Error finding container a1a1e25a4759e4f33cccf88e99aba9f2f36d7edb9c5462b2611303141d9a5686: Status 404 returned error can't find the container with id a1a1e25a4759e4f33cccf88e99aba9f2f36d7edb9c5462b2611303141d9a5686 Apr 16 14:21:20.005117 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:20.005074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" event={"ID":"849364b2-f5e1-424e-b716-447daadb40de","Type":"ContainerStarted","Data":"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9"} Apr 16 14:21:20.005117 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:20.005115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" event={"ID":"849364b2-f5e1-424e-b716-447daadb40de","Type":"ContainerStarted","Data":"a1a1e25a4759e4f33cccf88e99aba9f2f36d7edb9c5462b2611303141d9a5686"} Apr 16 14:21:20.005483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:20.005146 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:21:20.022102 ip-10-0-128-129 kubenswrapper[2572]: I0416 
14:21:20.022047 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podStartSLOduration=2.022032854 podStartE2EDuration="2.022032854s" podCreationTimestamp="2026-04-16 14:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:21:20.019706379 +0000 UTC m=+1308.698213266" watchObservedRunningTime="2026-04-16 14:21:20.022032854 +0000 UTC m=+1308.700539742" Apr 16 14:21:26.016145 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:21:26.016117 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:28:56.992082 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:28:56.992052 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:28:56.994750 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:28:56.992292 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" containerID="cri-o://224c28ebefb797027e0e1ef3283bde6bf26bcb531416b81532829ff63e78b3f8" gracePeriod=30 Apr 16 14:28:58.899184 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:28:58.899145 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:03.899185 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:03.899137 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:08.899484 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:08.899434 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:08.900026 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:08.899543 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:29:13.898979 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:13.898930 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:18.898655 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:18.898566 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:23.899774 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:23.899688 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:27.468060 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.467977 2572 generic.go:358] "Generic (PLEG): container finished" podID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerID="224c28ebefb797027e0e1ef3283bde6bf26bcb531416b81532829ff63e78b3f8" exitCode=0 Apr 16 14:29:27.468060 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.468039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" event={"ID":"b369d698-7d8c-4228-ba99-ef30375cff2a","Type":"ContainerDied","Data":"224c28ebefb797027e0e1ef3283bde6bf26bcb531416b81532829ff63e78b3f8"} Apr 16 14:29:27.631527 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.631507 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:29:27.787304 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.787272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle\") pod \"b369d698-7d8c-4228-ba99-ef30375cff2a\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " Apr 16 14:29:27.787453 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.787385 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls\") pod \"b369d698-7d8c-4228-ba99-ef30375cff2a\" (UID: \"b369d698-7d8c-4228-ba99-ef30375cff2a\") " Apr 16 14:29:27.787647 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.787624 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b369d698-7d8c-4228-ba99-ef30375cff2a" (UID: "b369d698-7d8c-4228-ba99-ef30375cff2a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:29:27.789443 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.789414 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b369d698-7d8c-4228-ba99-ef30375cff2a" (UID: "b369d698-7d8c-4228-ba99-ef30375cff2a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:29:27.888175 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.888152 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b369d698-7d8c-4228-ba99-ef30375cff2a-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:29:27.888175 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:27.888177 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b369d698-7d8c-4228-ba99-ef30375cff2a-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:29:28.472041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:28.472009 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" event={"ID":"b369d698-7d8c-4228-ba99-ef30375cff2a","Type":"ContainerDied","Data":"512ba4624f63d501c427cc13c144525c0575403cc64161989022970b5eefb9eb"} Apr 16 14:29:28.472041 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:28.472032 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4" Apr 16 14:29:28.472546 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:28.472047 2572 scope.go:117] "RemoveContainer" containerID="224c28ebefb797027e0e1ef3283bde6bf26bcb531416b81532829ff63e78b3f8" Apr 16 14:29:28.489323 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:28.489298 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:29:28.495186 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:28.495165 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-911ed-78995f655-sjdp4"] Apr 16 14:29:29.902121 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:29.902088 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" path="/var/lib/kubelet/pods/b369d698-7d8c-4228-ba99-ef30375cff2a/volumes" Apr 16 14:29:32.693706 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:32.693676 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:29:32.694149 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:32.693961 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" containerID="cri-o://0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9" gracePeriod=30 Apr 16 14:29:36.013477 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:36.013434 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:41.013442 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:41.013390 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:46.013154 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:46.013108 2572 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:46.013658 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:46.013233 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:29:51.012619 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:51.012577 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:29:56.012541 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:29:56.012487 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:01.012700 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:01.012655 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:02.839803 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:02.839784 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:30:03.038535 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.038514 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") pod \"849364b2-f5e1-424e-b716-447daadb40de\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " Apr 16 14:30:03.038665 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.038547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle\") pod \"849364b2-f5e1-424e-b716-447daadb40de\" (UID: \"849364b2-f5e1-424e-b716-447daadb40de\") " Apr 16 14:30:03.038954 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.038930 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "849364b2-f5e1-424e-b716-447daadb40de" (UID: "849364b2-f5e1-424e-b716-447daadb40de"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:30:03.040675 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.040655 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "849364b2-f5e1-424e-b716-447daadb40de" (UID: "849364b2-f5e1-424e-b716-447daadb40de"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:30:03.139210 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.139188 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/849364b2-f5e1-424e-b716-447daadb40de-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:30:03.139210 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.139209 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/849364b2-f5e1-424e-b716-447daadb40de-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:30:03.575849 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.575815 2572 generic.go:358] "Generic (PLEG): container finished" podID="849364b2-f5e1-424e-b716-447daadb40de" containerID="0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9" exitCode=0 Apr 16 14:30:03.576000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.575868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" event={"ID":"849364b2-f5e1-424e-b716-447daadb40de","Type":"ContainerDied","Data":"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9"} Apr 16 14:30:03.576000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.575876 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" Apr 16 14:30:03.576000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.575895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk" event={"ID":"849364b2-f5e1-424e-b716-447daadb40de","Type":"ContainerDied","Data":"a1a1e25a4759e4f33cccf88e99aba9f2f36d7edb9c5462b2611303141d9a5686"} Apr 16 14:30:03.576000 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.575909 2572 scope.go:117] "RemoveContainer" containerID="0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9" Apr 16 14:30:03.590076 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.590059 2572 scope.go:117] "RemoveContainer" containerID="0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9" Apr 16 14:30:03.590303 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:03.590286 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9\": container with ID starting with 0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9 not found: ID does not exist" containerID="0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9" Apr 16 14:30:03.590369 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.590312 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9"} err="failed to get container status \"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9\": rpc error: code = NotFound desc = could not find container \"0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9\": container with ID starting with 0f8092a75193e92d496c13d4a044b165a6e2c2a3868ef3fb150622dba6ced4d9 not found: ID does not exist" Apr 16 14:30:03.599271 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.599248 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:30:03.604686 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.604665 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-009d9-764c944467-bghlk"] Apr 16 14:30:03.902738 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:03.902676 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849364b2-f5e1-424e-b716-447daadb40de" path="/var/lib/kubelet/pods/849364b2-f5e1-424e-b716-447daadb40de/volumes" Apr 16 14:30:07.227894 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.227819 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:07.228254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228148 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" Apr 16 14:30:07.228254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228160 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" Apr 16 14:30:07.228254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228172 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" Apr 16 14:30:07.228254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228179 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" Apr 16 14:30:07.228254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228250 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="849364b2-f5e1-424e-b716-447daadb40de" containerName="sequence-graph-009d9" Apr 16 14:30:07.228447 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.228262 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b369d698-7d8c-4228-ba99-ef30375cff2a" containerName="ensemble-graph-911ed" Apr 16 14:30:07.232475 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.232454 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.234813 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.234791 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0b236-kube-rbac-proxy-sar-config\"" Apr 16 14:30:07.234973 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.234957 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0b236-serving-cert\"" Apr 16 14:30:07.235051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.235017 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:30:07.235051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.235036 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\"" Apr 16 14:30:07.239440 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.239416 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:07.266422 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.266393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.266514 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.266447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.366930 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.366907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.367034 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.366962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.367104 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:07.367060 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-0b236-serving-cert: secret "splitter-graph-0b236-serving-cert" not found Apr 16 14:30:07.367160 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:07.367148 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls podName:d48e9aaa-8bdd-48b2-9692-22359a990e15 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:30:07.867126451 +0000 UTC m=+1836.545633324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls") pod "splitter-graph-0b236-856d54479f-8q95m" (UID: "d48e9aaa-8bdd-48b2-9692-22359a990e15") : secret "splitter-graph-0b236-serving-cert" not found Apr 16 14:30:07.367520 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.367503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.870106 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.870077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:07.872533 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:07.872515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") pod \"splitter-graph-0b236-856d54479f-8q95m\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:08.144279 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.144222 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:08.264642 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.264613 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:08.267983 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:30:08.267956 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be WatchSource:0}: Error finding container 31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be: Status 404 returned error can't find the container with id 31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be Apr 16 14:30:08.270001 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.269982 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:30:08.596519 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.596485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" event={"ID":"d48e9aaa-8bdd-48b2-9692-22359a990e15","Type":"ContainerStarted","Data":"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e"} Apr 16 14:30:08.596519 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.596521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" event={"ID":"d48e9aaa-8bdd-48b2-9692-22359a990e15","Type":"ContainerStarted","Data":"31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be"} Apr 16 14:30:08.596730 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.596606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:08.620443 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:08.620396 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podStartSLOduration=1.620381683 podStartE2EDuration="1.620381683s" podCreationTimestamp="2026-04-16 14:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:08.612849295 +0000 UTC m=+1837.291356183" watchObservedRunningTime="2026-04-16 14:30:08.620381683 +0000 UTC m=+1837.298888571" Apr 16 14:30:14.605059 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:14.605031 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:17.301454 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:17.301421 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:17.301890 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:17.301708 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" containerID="cri-o://ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e" gracePeriod=30 Apr 16 14:30:19.603133 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:19.603099 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:24.603952 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:24.603901 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:29.603754 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:29.603717 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:29.604199 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:29.603829 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:34.603833 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:34.603796 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:39.604196 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:39.604152 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:42.914963 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:42.914933 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"] Apr 16 14:30:42.918591 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:42.918569 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:42.921118 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:42.921097 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3ca33-kube-rbac-proxy-sar-config\"" Apr 16 14:30:42.921241 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:42.921222 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3ca33-serving-cert\"" Apr 16 14:30:42.927475 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:42.927451 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"] Apr 16 14:30:43.004108 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.004083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.004226 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.004116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.105088 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.105051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.105197 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.105127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.105722 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.105699 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.107530 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.107507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls\") pod \"switch-graph-3ca33-66cd5cfb9d-fmlpr\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") " pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.229456 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.229399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.346245 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.346224 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"] Apr 16 14:30:43.702403 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.702373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" event={"ID":"0fb10500-b6fe-4577-9c8c-1e994177980d","Type":"ContainerStarted","Data":"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"} Apr 16 14:30:43.702557 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.702410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" event={"ID":"0fb10500-b6fe-4577-9c8c-1e994177980d","Type":"ContainerStarted","Data":"58406b40c4737ac6e69f996d839b91a5840431445a381dd77544d66b2771f4b0"} Apr 16 14:30:43.702557 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.702507 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:30:43.719016 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:43.718978 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podStartSLOduration=1.718964133 podStartE2EDuration="1.718964133s" podCreationTimestamp="2026-04-16 14:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:43.717083819 +0000 UTC m=+1872.395590702" watchObservedRunningTime="2026-04-16 14:30:43.718964133 +0000 UTC m=+1872.397471021" Apr 16 14:30:44.603706 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:44.603668 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:30:47.325553 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:47.325515 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-conmon-ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:30:47.325898 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:47.325596 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-conmon-ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:30:47.325898 ip-10-0-128-129 
kubenswrapper[2572]: E0416 14:30:47.325676 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-conmon-ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:30:47.325898 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:47.325750 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48e9aaa_8bdd_48b2_9692_22359a990e15.slice/crio-ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:30:47.447536 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.447516 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:47.541069 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.541041 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") pod \"d48e9aaa-8bdd-48b2-9692-22359a990e15\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " Apr 16 14:30:47.541195 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.541086 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle\") pod \"d48e9aaa-8bdd-48b2-9692-22359a990e15\" (UID: \"d48e9aaa-8bdd-48b2-9692-22359a990e15\") " Apr 16 14:30:47.541455 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.541431 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d48e9aaa-8bdd-48b2-9692-22359a990e15" (UID: "d48e9aaa-8bdd-48b2-9692-22359a990e15"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:30:47.543130 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.543112 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d48e9aaa-8bdd-48b2-9692-22359a990e15" (UID: "d48e9aaa-8bdd-48b2-9692-22359a990e15"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:30:47.642200 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.642148 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d48e9aaa-8bdd-48b2-9692-22359a990e15-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:30:47.642200 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.642170 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48e9aaa-8bdd-48b2-9692-22359a990e15-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\"" Apr 16 14:30:47.714581 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.714553 2572 generic.go:358] "Generic (PLEG): container finished" podID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerID="ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e" exitCode=0 Apr 16 14:30:47.714672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.714609 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" Apr 16 14:30:47.714672 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.714643 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" event={"ID":"d48e9aaa-8bdd-48b2-9692-22359a990e15","Type":"ContainerDied","Data":"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e"} Apr 16 14:30:47.714747 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.714681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m" event={"ID":"d48e9aaa-8bdd-48b2-9692-22359a990e15","Type":"ContainerDied","Data":"31be3994bd273eb36c02b4cbbb5997453ade1006f35d1440b099fd497af820be"} Apr 16 14:30:47.714747 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.714706 2572 scope.go:117] "RemoveContainer" containerID="ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e" Apr 16 14:30:47.722489 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.722464 2572 scope.go:117] "RemoveContainer" containerID="ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e" Apr 16 14:30:47.722737 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:30:47.722716 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e\": container with ID starting with ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e not found: ID does not exist" containerID="ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e" Apr 16 14:30:47.722804 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.722748 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e"} err="failed to get container status \"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e\": rpc error: code = NotFound desc = could not find container \"ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e\": container with ID starting with ff4c94e841c426a3e15ca58944e669cbf8f39075b09e07574b3a2194c92f808e not found: ID does not exist" Apr 16 14:30:47.734292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.734273 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:47.738106 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.738085 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0b236-856d54479f-8q95m"] Apr 16 14:30:47.902598 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:47.902547 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" path="/var/lib/kubelet/pods/d48e9aaa-8bdd-48b2-9692-22359a990e15/volumes" Apr 16 14:30:49.710872 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:30:49.710837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:31:27.521080 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.521049 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"] Apr 16 14:31:27.521511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.521403 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" Apr 16 14:31:27.521511 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.521420 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" Apr 16 14:31:27.521585 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.521526 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d48e9aaa-8bdd-48b2-9692-22359a990e15" containerName="splitter-graph-0b236" Apr 16 14:31:27.524441 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.524423 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:27.527026 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.527001 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a940c-kube-rbac-proxy-sar-config\"" Apr 16 14:31:27.527240 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.527221 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a940c-serving-cert\"" Apr 16 14:31:27.540095 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.540075 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"] Apr 16 14:31:27.639120 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.639094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:27.639221 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.639167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:27.740045 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.740020 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:27.740146 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.740058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:27.740190 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:31:27.740159 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-a940c-serving-cert: secret "splitter-graph-a940c-serving-cert" not found Apr 16 14:31:27.740235 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:31:27.740206 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls podName:9506c17c-30ff-4776-a3c2-308e6dfabdee nodeName:}" failed. No retries permitted until 2026-04-16 14:31:28.24019117 +0000 UTC m=+1916.918698036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls") pod "splitter-graph-a940c-74c5f5f874-5gw2z" (UID: "9506c17c-30ff-4776-a3c2-308e6dfabdee") : secret "splitter-graph-a940c-serving-cert" not found Apr 16 14:31:27.740618 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:27.740601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:28.243808 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.243772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:28.246397 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.246365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") pod \"splitter-graph-a940c-74c5f5f874-5gw2z\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") " pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" Apr 16 14:31:28.433815 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.433793 2572 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:31:28.555010 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.554987 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"]
Apr 16 14:31:28.557221 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:31:28.557194 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9506c17c_30ff_4776_a3c2_308e6dfabdee.slice/crio-db9cd53ae9bce2615241eb23355a3269e9342a81b118d59e95a778b95c72da93 WatchSource:0}: Error finding container db9cd53ae9bce2615241eb23355a3269e9342a81b118d59e95a778b95c72da93: Status 404 returned error can't find the container with id db9cd53ae9bce2615241eb23355a3269e9342a81b118d59e95a778b95c72da93
Apr 16 14:31:28.852016 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.851942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" event={"ID":"9506c17c-30ff-4776-a3c2-308e6dfabdee","Type":"ContainerStarted","Data":"9b7789ba0e998a81e7d7bb2a38cd27848d44e277cfe20d3d5405f61853380d3b"}
Apr 16 14:31:28.852016 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.851976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" event={"ID":"9506c17c-30ff-4776-a3c2-308e6dfabdee","Type":"ContainerStarted","Data":"db9cd53ae9bce2615241eb23355a3269e9342a81b118d59e95a778b95c72da93"}
Apr 16 14:31:28.852169 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.852072 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"
Apr 16 14:31:28.869376 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:28.869312 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podStartSLOduration=1.869299899 podStartE2EDuration="1.869299899s" podCreationTimestamp="2026-04-16 14:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:31:28.869230598 +0000 UTC m=+1917.547737483" watchObservedRunningTime="2026-04-16 14:31:28.869299899 +0000 UTC m=+1917.547806786"
Apr 16 14:31:34.860161 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:31:34.860129 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"
Apr 16 14:39:42.283110 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:42.283077 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"]
Apr 16 14:39:42.285501 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:42.283325 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" containerID="cri-o://9b7789ba0e998a81e7d7bb2a38cd27848d44e277cfe20d3d5405f61853380d3b" gracePeriod=30
Apr 16 14:39:44.859462 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:44.859416 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:39:49.859941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:49.859848 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:39:54.859794 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:54.859745 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:39:54.860251 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:54.859858 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"
Apr 16 14:39:59.859173 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:39:59.859127 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:40:04.859486 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:04.859446 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:40:09.859389 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:09.859317 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:40:12.404628 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.404600 2572 generic.go:358] "Generic (PLEG): container finished" podID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerID="9b7789ba0e998a81e7d7bb2a38cd27848d44e277cfe20d3d5405f61853380d3b" exitCode=0
Apr 16 14:40:12.404945 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.404678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" event={"ID":"9506c17c-30ff-4776-a3c2-308e6dfabdee","Type":"ContainerDied","Data":"9b7789ba0e998a81e7d7bb2a38cd27848d44e277cfe20d3d5405f61853380d3b"}
Apr 16 14:40:12.418622 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.418602 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"
Apr 16 14:40:12.514292 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.514264 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") pod \"9506c17c-30ff-4776-a3c2-308e6dfabdee\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") "
Apr 16 14:40:12.514454 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.514390 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle\") pod \"9506c17c-30ff-4776-a3c2-308e6dfabdee\" (UID: \"9506c17c-30ff-4776-a3c2-308e6dfabdee\") "
Apr 16 14:40:12.514727 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.514695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9506c17c-30ff-4776-a3c2-308e6dfabdee" (UID: "9506c17c-30ff-4776-a3c2-308e6dfabdee"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:40:12.516404 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.516352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9506c17c-30ff-4776-a3c2-308e6dfabdee" (UID: "9506c17c-30ff-4776-a3c2-308e6dfabdee"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:40:12.615941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.615880 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9506c17c-30ff-4776-a3c2-308e6dfabdee-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:40:12.615941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:12.615907 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9506c17c-30ff-4776-a3c2-308e6dfabdee-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:40:13.408431 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.408398 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z" event={"ID":"9506c17c-30ff-4776-a3c2-308e6dfabdee","Type":"ContainerDied","Data":"db9cd53ae9bce2615241eb23355a3269e9342a81b118d59e95a778b95c72da93"}
Apr 16 14:40:13.408854 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.408437 2572 scope.go:117] "RemoveContainer" containerID="9b7789ba0e998a81e7d7bb2a38cd27848d44e277cfe20d3d5405f61853380d3b"
Apr 16 14:40:13.408854 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.408445 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"
Apr 16 14:40:13.428852 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.428824 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"]
Apr 16 14:40:13.432646 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.432622 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a940c-74c5f5f874-5gw2z"]
Apr 16 14:40:13.903185 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:40:13.903154 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" path="/var/lib/kubelet/pods/9506c17c-30ff-4776-a3c2-308e6dfabdee/volumes"
Apr 16 14:47:02.106119 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:02.106085 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"]
Apr 16 14:47:02.108648 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:02.106391 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" containerID="cri-o://6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3" gracePeriod=30
Apr 16 14:47:03.574258 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.574223 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-krs5l/must-gather-xrmpp"]
Apr 16 14:47:03.574601 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.574581 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c"
Apr 16 14:47:03.574601 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.574594 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c"
Apr 16 14:47:03.574681 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.574656 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9506c17c-30ff-4776-a3c2-308e6dfabdee" containerName="splitter-graph-a940c"
Apr 16 14:47:03.577491 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.577474 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-krs5l/must-gather-xrmpp"
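Every pod in this excerpt winds down through the same ordered steps: UnmountVolume, TearDown, "Volume detached", SyncLoop DELETE/REMOVE, and finally "Cleaned up orphaned pod volumes dir". When auditing a long journal like this one, it can help to reduce it to per-pod lifecycle timelines; a throwaway sketch that matches only the message shapes seen in this log, nothing official:

```go
// Extract per-pod lifecycle events from journal lines like the ones above.
// The regular expressions cover just the handful of kubelet messages that
// appear in this excerpt; feed the journal on stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	tsRe  = regexp.MustCompile(`^(Apr 16 \d{2}:\d{2}:\d{2}\.\d+)`)
	evtRe = regexp.MustCompile(`"(SyncLoop ADD|SyncLoop UPDATE|SyncLoop DELETE|SyncLoop REMOVE|Killing container with a grace period|Cleaned up orphaned pod volumes dir)"`)
	podRe = regexp.MustCompile(`pods?=\[?"([^"\]]+)"`)
)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		ts := tsRe.FindStringSubmatch(line)
		evt := evtRe.FindStringSubmatch(line)
		pod := podRe.FindStringSubmatch(line)
		if ts != nil && evt != nil && pod != nil {
			fmt.Printf("%s  %-45s %s\n", ts[1], evt[1], pod[1])
		}
	}
}
```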
Apr 16 14:47:03.579902 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.579873 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-krs5l\"/\"default-dockercfg-ps89z\""
Apr 16 14:47:03.581010 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.580980 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-krs5l\"/\"openshift-service-ca.crt\""
Apr 16 14:47:03.582796 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.581224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-krs5l\"/\"kube-root-ca.crt\""
Apr 16 14:47:03.593720 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.593698 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-krs5l/must-gather-xrmpp"]
Apr 16 14:47:03.624103 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.624077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.624209 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.624118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckktq\" (UniqueName: \"kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.725183 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.725155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckktq\" (UniqueName: \"kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.725273 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.725230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.725500 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.725487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.734060 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.734031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckktq\" (UniqueName: \"kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq\") pod \"must-gather-xrmpp\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") " pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:03.904933 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:03.904884 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:04.022183 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:04.022158 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-krs5l/must-gather-xrmpp"]
Apr 16 14:47:04.024599 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:47:04.024570 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacab4279_064a_416f_965a_4c03315cd947.slice/crio-9905edbc449a8556d891706530767a8d90765b58352e487a118d4e97aed298d5 WatchSource:0}: Error finding container 9905edbc449a8556d891706530767a8d90765b58352e487a118d4e97aed298d5: Status 404 returned error can't find the container with id 9905edbc449a8556d891706530767a8d90765b58352e487a118d4e97aed298d5
Apr 16 14:47:04.026377 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:04.026359 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:47:04.622731 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:04.622671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krs5l/must-gather-xrmpp" event={"ID":"acab4279-064a-416f-965a-4c03315cd947","Type":"ContainerStarted","Data":"9905edbc449a8556d891706530767a8d90765b58352e487a118d4e97aed298d5"}
Apr 16 14:47:04.710607 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:04.710566 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:47:08.637668 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:08.637597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krs5l/must-gather-xrmpp" event={"ID":"acab4279-064a-416f-965a-4c03315cd947","Type":"ContainerStarted","Data":"b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf"}
Apr 16 14:47:08.637668 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:08.637670 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krs5l/must-gather-xrmpp" event={"ID":"acab4279-064a-416f-965a-4c03315cd947","Type":"ContainerStarted","Data":"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"}
Apr 16 14:47:08.651933 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:08.651882 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-krs5l/must-gather-xrmpp" podStartSLOduration=1.745300257 podStartE2EDuration="5.651863579s" podCreationTimestamp="2026-04-16 14:47:03 +0000 UTC" firstStartedPulling="2026-04-16 14:47:04.026542982 +0000 UTC m=+2852.705049852" lastFinishedPulling="2026-04-16 14:47:07.933106303 +0000 UTC m=+2856.611613174" observedRunningTime="2026-04-16 14:47:08.651484886 +0000 UTC m=+2857.329991813" watchObservedRunningTime="2026-04-16 14:47:08.651863579 +0000 UTC m=+2857.330370469"
Apr 16 14:47:09.709483 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:09.709447 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:47:14.709157 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:14.709119 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503"
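The startup record for must-gather-xrmpp above is internally consistent: podStartE2EDuration (5.651863579s) minus the image-pull window (firstStartedPulling to lastFinishedPulling, about 3.906563321s) gives podStartSLOduration (1.745300257s), i.e. the SLO figure appears to exclude image-pull time; contrast the earlier splitter-graph record, where no pull happened and the two durations were equal. A quick check, with the timestamps copied from the log line:

```go
// Verify: podStartE2EDuration - image pull window ~= podStartSLOduration.
// Timestamps and durations are copied verbatim from the log entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	firstPull := parse("2026-04-16 14:47:04.026542982 +0000 UTC")
	lastPull := parse("2026-04-16 14:47:07.933106303 +0000 UTC")
	e2e := 5.651863579 // podStartE2EDuration in seconds
	pull := lastPull.Sub(firstPull).Seconds()
	fmt.Printf("pull window: %.9fs\n", pull)     // ~3.906563321s
	fmt.Printf("e2e - pull:  %.9fs\n", e2e-pull) // ~1.745300258s ~= podStartSLOduration
}
```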
pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:47:14.709584 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:14.709236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" Apr 16 14:47:16.533559 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:16.533467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:17.262866 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:17.262838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:17.994190 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:17.994159 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:18.693579 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:18.693539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:19.398018 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:19.397987 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:19.710829 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:19.710738 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:47:20.093103 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:20.093069 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:20.777727 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:20.777691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:21.478667 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:21.478639 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:22.180640 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:22.180544 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:22.883475 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:22.883413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:23.582732 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:23.582699 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:24.279845 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:24.279814 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-3ca33-66cd5cfb9d-fmlpr_0fb10500-b6fe-4577-9c8c-1e994177980d/switch-graph-3ca33/0.log" Apr 16 14:47:24.711233 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:24.711155 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:47:26.694138 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:26.694112 2572 generic.go:358] "Generic (PLEG): container finished" podID="acab4279-064a-416f-965a-4c03315cd947" containerID="6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46" exitCode=0 Apr 16 14:47:26.694502 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:26.694202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krs5l/must-gather-xrmpp" event={"ID":"acab4279-064a-416f-965a-4c03315cd947","Type":"ContainerDied","Data":"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"} Apr 16 14:47:26.694547 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:26.694535 2572 scope.go:117] "RemoveContainer" containerID="6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46" Apr 16 14:47:27.672433 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:27.672400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krs5l_must-gather-xrmpp_acab4279-064a-416f-965a-4c03315cd947/gather/0.log" Apr 16 14:47:29.709626 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:29.709589 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:47:30.793305 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:30.793274 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-444dx_38192d70-e6e7-4372-ae09-7af27ef748e6/global-pull-secret-syncer/0.log" Apr 16 14:47:30.970943 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:30.970914 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-srjct_05bbc014-12fb-4a97-900d-ab870f220e6f/konnectivity-agent/0.log" Apr 16 14:47:31.030123 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:31.030090 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-129.ec2.internal_ed2e5e70f0ff42ca6d01355865582d97/haproxy/0.log" Apr 16 14:47:32.242460 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.242438 2572 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 14:47:32.364023 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.363950 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls\") pod \"0fb10500-b6fe-4577-9c8c-1e994177980d\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") "
Apr 16 14:47:32.364023 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.364007 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle\") pod \"0fb10500-b6fe-4577-9c8c-1e994177980d\" (UID: \"0fb10500-b6fe-4577-9c8c-1e994177980d\") "
Apr 16 14:47:32.364372 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.364323 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0fb10500-b6fe-4577-9c8c-1e994177980d" (UID: "0fb10500-b6fe-4577-9c8c-1e994177980d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:47:32.365962 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.365943 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0fb10500-b6fe-4577-9c8c-1e994177980d" (UID: "0fb10500-b6fe-4577-9c8c-1e994177980d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:47:32.465320 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.465293 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb10500-b6fe-4577-9c8c-1e994177980d-proxy-tls\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:47:32.465320 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.465316 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fb10500-b6fe-4577-9c8c-1e994177980d-openshift-service-ca-bundle\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:47:32.710214 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.710150 2572 generic.go:358] "Generic (PLEG): container finished" podID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerID="6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3" exitCode=0
Apr 16 14:47:32.710214 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.710190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" event={"ID":"0fb10500-b6fe-4577-9c8c-1e994177980d","Type":"ContainerDied","Data":"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"}
Apr 16 14:47:32.710214 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.710210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr" event={"ID":"0fb10500-b6fe-4577-9c8c-1e994177980d","Type":"ContainerDied","Data":"58406b40c4737ac6e69f996d839b91a5840431445a381dd77544d66b2771f4b0"}
Apr 16 14:47:32.710429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.710210 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"
Apr 16 14:47:32.710429 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.710229 2572 scope.go:117] "RemoveContainer" containerID="6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"
Apr 16 14:47:32.718143 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.718128 2572 scope.go:117] "RemoveContainer" containerID="6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"
Apr 16 14:47:32.718403 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:47:32.718384 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3\": container with ID starting with 6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3 not found: ID does not exist" containerID="6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"
Apr 16 14:47:32.718450 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.718411 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3"} err="failed to get container status \"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3\": rpc error: code = NotFound desc = could not find container \"6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3\": container with ID starting with 6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3 not found: ID does not exist"
Apr 16 14:47:32.729771 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.729750 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"]
Apr 16 14:47:32.735541 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:32.735522 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3ca33-66cd5cfb9d-fmlpr"]
Apr 16 14:47:33.086938 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.086913 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-krs5l/must-gather-xrmpp"]
Apr 16 14:47:33.087140 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.087104 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-krs5l/must-gather-xrmpp" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="copy" containerID="cri-o://b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf" gracePeriod=2
Apr 16 14:47:33.093055 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.093035 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-krs5l/must-gather-xrmpp"]
Apr 16 14:47:33.311588 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.311567 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krs5l_must-gather-xrmpp_acab4279-064a-416f-965a-4c03315cd947/copy/0.log"
Apr 16 14:47:33.311914 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.311895 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-krs5l/must-gather-xrmpp"
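The NotFound errors at 14:47:32.718 are a benign race, not a failure: the kubelet re-queries the status of a container that CRI-O has already removed, logs the error, and carries on, since for a deletion the container's absence is exactly the desired outcome. A sketch of that idempotent-removal pattern using gRPC status codes (removeContainer here is a stand-in, not the actual CRI client):

```go
// The "DeleteContainer returned error ... NotFound" pair above is a race
// between the kubelet's cleanup and the runtime's own garbage collection.
// Cleanup paths typically treat NotFound as success.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func removeContainer(id string) error {
	// Stand-in for a CRI RemoveContainer call racing with CRI-O's own GC.
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func removeIdempotent(id string) error {
	err := removeContainer(id)
	if status.Code(err) == codes.NotFound {
		return nil // already removed: the desired state holds
	}
	return err
}

func main() {
	err := removeIdempotent("6edb0fa4d6bf74e5a286ffcb619ccc256f62f2ca2e2bc5888825dcc82b72c5a3")
	fmt.Println("removed (or already gone):", err == nil)
}
```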
Apr 16 14:47:33.313980 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.313957 2572 status_manager.go:895] "Failed to get status for pod" podUID="acab4279-064a-416f-965a-4c03315cd947" pod="openshift-must-gather-krs5l/must-gather-xrmpp" err="pods \"must-gather-xrmpp\" is forbidden: User \"system:node:ip-10-0-128-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-krs5l\": no relationship found between node 'ip-10-0-128-129.ec2.internal' and this object"
Apr 16 14:47:33.475198 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.475142 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output\") pod \"acab4279-064a-416f-965a-4c03315cd947\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") "
Apr 16 14:47:33.475198 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.475189 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckktq\" (UniqueName: \"kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq\") pod \"acab4279-064a-416f-965a-4c03315cd947\" (UID: \"acab4279-064a-416f-965a-4c03315cd947\") "
Apr 16 14:47:33.476660 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.476635 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "acab4279-064a-416f-965a-4c03315cd947" (UID: "acab4279-064a-416f-965a-4c03315cd947"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:47:33.477288 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.477269 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq" (OuterVolumeSpecName: "kube-api-access-ckktq") pod "acab4279-064a-416f-965a-4c03315cd947" (UID: "acab4279-064a-416f-965a-4c03315cd947"). InnerVolumeSpecName "kube-api-access-ckktq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:47:33.575627 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.575605 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/acab4279-064a-416f-965a-4c03315cd947-must-gather-output\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:47:33.575627 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.575626 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ckktq\" (UniqueName: \"kubernetes.io/projected/acab4279-064a-416f-965a-4c03315cd947-kube-api-access-ckktq\") on node \"ip-10-0-128-129.ec2.internal\" DevicePath \"\""
Apr 16 14:47:33.715029 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.715011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krs5l_must-gather-xrmpp_acab4279-064a-416f-965a-4c03315cd947/copy/0.log"
Apr 16 14:47:33.715441 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.715408 2572 generic.go:358] "Generic (PLEG): container finished" podID="acab4279-064a-416f-965a-4c03315cd947" containerID="b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf" exitCode=143
Apr 16 14:47:33.715532 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.715462 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-krs5l/must-gather-xrmpp"
Apr 16 14:47:33.715532 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.715478 2572 scope.go:117] "RemoveContainer" containerID="b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf"
Apr 16 14:47:33.717780 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.717743 2572 status_manager.go:895] "Failed to get status for pod" podUID="acab4279-064a-416f-965a-4c03315cd947" pod="openshift-must-gather-krs5l/must-gather-xrmpp" err="pods \"must-gather-xrmpp\" is forbidden: User \"system:node:ip-10-0-128-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-krs5l\": no relationship found between node 'ip-10-0-128-129.ec2.internal' and this object"
Apr 16 14:47:33.723792 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.723773 2572 scope.go:117] "RemoveContainer" containerID="6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"
Apr 16 14:47:33.726557 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.726513 2572 status_manager.go:895] "Failed to get status for pod" podUID="acab4279-064a-416f-965a-4c03315cd947" pod="openshift-must-gather-krs5l/must-gather-xrmpp" err="pods \"must-gather-xrmpp\" is forbidden: User \"system:node:ip-10-0-128-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-krs5l\": no relationship found between node 'ip-10-0-128-129.ec2.internal' and this object"
Apr 16 14:47:33.735380 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.735364 2572 scope.go:117] "RemoveContainer" containerID="b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf"
Apr 16 14:47:33.735620 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:47:33.735602 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf\": container with ID starting with b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf not found: ID does not exist" containerID="b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf"
Apr 16 14:47:33.735675 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.735629 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf"} err="failed to get container status \"b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf\": rpc error: code = NotFound desc = could not find container \"b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf\": container with ID starting with b81a5959fe0bdd42ee271f5979eb8cea0e9fc488f5e3b4701104c61e4a7e40bf not found: ID does not exist"
Apr 16 14:47:33.735675 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.735647 2572 scope.go:117] "RemoveContainer" containerID="6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"
Apr 16 14:47:33.735858 ip-10-0-128-129 kubenswrapper[2572]: E0416 14:47:33.735844 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46\": container with ID starting with 6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46 not found: ID does not exist" containerID="6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"
Apr 16 14:47:33.735895 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.735862 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46"} err="failed to get container status \"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46\": rpc error: code = NotFound desc = could not find container \"6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46\": container with ID starting with 6f281e47e622da3dbd522e8b4b040e949114ee21701d05843dd7bddbe46cce46 not found: ID does not exist"
Apr 16 14:47:33.902012 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.901988 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" path="/var/lib/kubelet/pods/0fb10500-b6fe-4577-9c8c-1e994177980d/volumes"
Apr 16 14:47:33.902317 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:33.902304 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acab4279-064a-416f-965a-4c03315cd947" path="/var/lib/kubelet/pods/acab4279-064a-416f-965a-4c03315cd947/volumes"
Apr 16 14:47:34.158559 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.158533 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/alertmanager/0.log"
Apr 16 14:47:34.188968 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.188948 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/config-reloader/0.log"
Apr 16 14:47:34.210394 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.210375 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/kube-rbac-proxy-web/0.log"
Apr 16 14:47:34.234462 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.234440 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/kube-rbac-proxy/0.log"
Apr 16 14:47:34.255071 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.255040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/kube-rbac-proxy-metric/0.log"
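Note the contrast between the two must-gather containers above: gather finished its work and exited 0, while copy was still running when the gracePeriod=2 kill arrived and exited 143, the conventional 128+signal encoding for SIGTERM (15). Decoding that convention:

```go
// Decode the two exit codes seen above: 0 is a normal return; codes above
// 128 follow the shell convention 128+N for termination by signal N, so
// 143 = 128 + 15 (SIGTERM), matching the grace-period kill of "copy".
package main

import "fmt"

func describeExit(code int) string {
	switch {
	case code == 0:
		return "clean exit"
	case code > 128:
		return fmt.Sprintf("killed by signal %d", code-128)
	default:
		return "application error"
	}
}

func main() {
	fmt.Println(0, "=>", describeExit(0))     // gather: clean exit
	fmt.Println(143, "=>", describeExit(143)) // copy: killed by signal 15 (SIGTERM)
}
```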
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/kube-rbac-proxy-metric/0.log" Apr 16 14:47:34.279941 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.279923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/prom-label-proxy/0.log" Apr 16 14:47:34.300122 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.300107 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_43221ea1-101c-4ab1-874d-4da01f9e5d7a/init-config-reloader/0.log" Apr 16 14:47:34.365463 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.365443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-nht7j_bde45541-f349-4350-8270-52b3eaad1325/kube-state-metrics/0.log" Apr 16 14:47:34.384996 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.384957 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-nht7j_bde45541-f349-4350-8270-52b3eaad1325/kube-rbac-proxy-main/0.log" Apr 16 14:47:34.406606 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.406589 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-nht7j_bde45541-f349-4350-8270-52b3eaad1325/kube-rbac-proxy-self/0.log" Apr 16 14:47:34.433136 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.433060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8769677c6-4zv5d_f20570e0-d258-40e7-94da-6a651303df3e/metrics-server/0.log" Apr 16 14:47:34.486895 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.486875 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k4bn_c82633dc-21cb-4f43-9155-2073ed72f663/node-exporter/0.log" Apr 16 14:47:34.509847 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.509830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k4bn_c82633dc-21cb-4f43-9155-2073ed72f663/kube-rbac-proxy/0.log" Apr 16 14:47:34.529719 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.529698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k4bn_c82633dc-21cb-4f43-9155-2073ed72f663/init-textfile/0.log" Apr 16 14:47:34.710254 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.710176 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-wjkdh_bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd/kube-rbac-proxy-main/0.log" Apr 16 14:47:34.732957 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.732935 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-wjkdh_bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd/kube-rbac-proxy-self/0.log" Apr 16 14:47:34.757085 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.757067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-wjkdh_bd5ecfdd-4dc7-4b6a-8fc1-e88fedd9bffd/openshift-state-metrics/0.log" Apr 16 14:47:34.808248 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.808220 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/prometheus/0.log" Apr 16 14:47:34.824484 ip-10-0-128-129 kubenswrapper[2572]: I0416 
14:47:34.824461 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/config-reloader/0.log" Apr 16 14:47:34.843522 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.843501 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/thanos-sidecar/0.log" Apr 16 14:47:34.863951 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.863929 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/kube-rbac-proxy-web/0.log" Apr 16 14:47:34.887083 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.887061 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/kube-rbac-proxy/0.log" Apr 16 14:47:34.906971 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.906942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/kube-rbac-proxy-thanos/0.log" Apr 16 14:47:34.928141 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:34.928114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e2577532-0f05-482a-8c50-c96bf606f03d/init-config-reloader/0.log" Apr 16 14:47:35.105974 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.105954 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/thanos-query/0.log" Apr 16 14:47:35.126099 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.126079 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/kube-rbac-proxy-web/0.log" Apr 16 14:47:35.146032 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.146004 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/kube-rbac-proxy/0.log" Apr 16 14:47:35.169051 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.169035 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/prom-label-proxy/0.log" Apr 16 14:47:35.192195 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.192174 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/kube-rbac-proxy-rules/0.log" Apr 16 14:47:35.211360 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:35.211344 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76df994fb-g25kv_aa0c8ff6-8266-47ba-aec5-dfa7c4b0d2be/kube-rbac-proxy-metrics/0.log" Apr 16 14:47:37.216794 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:37.216717 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69cf668bdb-nnwqf_7aa05115-7b88-45e3-b8e3-21cda1b7e7cf/console/0.log" Apr 16 14:47:38.076371 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076328 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c"] Apr 16 14:47:38.076666 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076653 2572 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="gather" Apr 16 14:47:38.076716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076668 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="gather" Apr 16 14:47:38.076716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076682 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="copy" Apr 16 14:47:38.076716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076687 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="copy" Apr 16 14:47:38.076716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076704 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" Apr 16 14:47:38.076716 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076710 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" Apr 16 14:47:38.076860 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076753 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="copy" Apr 16 14:47:38.076860 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076763 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fb10500-b6fe-4577-9c8c-1e994177980d" containerName="switch-graph-3ca33" Apr 16 14:47:38.076860 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.076770 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="acab4279-064a-416f-965a-4c03315cd947" containerName="gather" Apr 16 14:47:38.079930 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.079909 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.082604 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.082583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wdzv9\"/\"default-dockercfg-hq28d\"" Apr 16 14:47:38.082695 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.082583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"openshift-service-ca.crt\"" Apr 16 14:47:38.083540 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.083528 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"kube-root-ca.crt\"" Apr 16 14:47:38.089753 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.089734 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c"] Apr 16 14:47:38.207542 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.207510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-podres\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.207661 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.207549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxmp\" (UniqueName: \"kubernetes.io/projected/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-kube-api-access-rhxmp\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.207661 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.207575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-lib-modules\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.207661 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.207622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-proc\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.207808 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.207679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-sys\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.308763 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-proc\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " 
pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-sys\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-podres\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxmp\" (UniqueName: \"kubernetes.io/projected/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-kube-api-access-rhxmp\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-proc\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-sys\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-lib-modules\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-podres\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.309090 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.308947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-lib-modules\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.316230 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.316214 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rhxmp\" (UniqueName: \"kubernetes.io/projected/f2b4cfb9-443d-4c73-bf61-e47fb76dfa29-kube-api-access-rhxmp\") pod \"perf-node-gather-daemonset-hrj5c\" (UID: \"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.366040 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.365979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n4t6t_360ea35e-bf48-4c5d-aeb1-dbe4c67646c3/dns/0.log" Apr 16 14:47:38.385262 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.385241 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n4t6t_360ea35e-bf48-4c5d-aeb1-dbe4c67646c3/kube-rbac-proxy/0.log" Apr 16 14:47:38.390224 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.390209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.405122 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.405105 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d7g97_8e7d9455-87de-4016-8826-39fe981aa729/dns-node-resolver/0.log" Apr 16 14:47:38.507944 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.507920 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c"] Apr 16 14:47:38.510514 ip-10-0-128-129 kubenswrapper[2572]: W0416 14:47:38.510487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf2b4cfb9_443d_4c73_bf61_e47fb76dfa29.slice/crio-fe64d6ffcaeb2762e9697f3ddba9c457d7f7cf53d3e67e12388fe2af84fc60f9 WatchSource:0}: Error finding container fe64d6ffcaeb2762e9697f3ddba9c457d7f7cf53d3e67e12388fe2af84fc60f9: Status 404 returned error can't find the container with id fe64d6ffcaeb2762e9697f3ddba9c457d7f7cf53d3e67e12388fe2af84fc60f9 Apr 16 14:47:38.731621 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.731558 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" event={"ID":"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29","Type":"ContainerStarted","Data":"1fed5ee8d2958e95b1b6ca74b096318b24d3198e0bbbb1c31d275d3fa0b3b1f9"} Apr 16 14:47:38.731621 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.731588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" event={"ID":"f2b4cfb9-443d-4c73-bf61-e47fb76dfa29","Type":"ContainerStarted","Data":"fe64d6ffcaeb2762e9697f3ddba9c457d7f7cf53d3e67e12388fe2af84fc60f9"} Apr 16 14:47:38.731754 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.731678 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:38.747701 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:38.747662 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" podStartSLOduration=0.74765079 podStartE2EDuration="747.65079ms" podCreationTimestamp="2026-04-16 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:47:38.747409284 +0000 UTC m=+2887.425916170" watchObservedRunningTime="2026-04-16 14:47:38.74765079 +0000 UTC m=+2887.426157678" Apr 16 14:47:38.894088 ip-10-0-128-129 
kubenswrapper[2572]: I0416 14:47:38.894063 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8nsr4_6f3e4196-7c1a-4fd5-b7a3-aac08e8eb660/node-ca/0.log" Apr 16 14:47:39.936228 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:39.936183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2nwg4_42fcb6b4-c71f-4846-9e64-95662201e229/serve-healthcheck-canary/0.log" Apr 16 14:47:40.435685 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:40.435638 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hq77c_6477641f-7f99-4033-bbb6-4840e371fbd2/kube-rbac-proxy/0.log" Apr 16 14:47:40.454318 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:40.454296 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hq77c_6477641f-7f99-4033-bbb6-4840e371fbd2/exporter/0.log" Apr 16 14:47:40.474538 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:40.474511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hq77c_6477641f-7f99-4033-bbb6-4840e371fbd2/extractor/0.log" Apr 16 14:47:42.514962 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:42.514922 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-75d667c7c4-z4dcx_7ff749b3-9a80-4bc3-80b4-276093aebd9d/manager/0.log" Apr 16 14:47:42.829487 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:42.829399 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-h6gqg_3a5e33de-9ecc-4979-bdd4-32c01cf17ac5/s3-init/0.log" Apr 16 14:47:42.856608 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:42.856583 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-9tc42_8939169a-f799-4397-b932-fc30821c51b2/seaweedfs/0.log" Apr 16 14:47:44.743604 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:44.743571 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-hrj5c" Apr 16 14:47:47.620995 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.620965 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/kube-multus-additional-cni-plugins/0.log" Apr 16 14:47:47.640437 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.640409 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/egress-router-binary-copy/0.log" Apr 16 14:47:47.660104 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.660080 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/cni-plugins/0.log" Apr 16 14:47:47.680686 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.680664 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/bond-cni-plugin/0.log" Apr 16 14:47:47.703398 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.703379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/routeoverride-cni/0.log" Apr 16 14:47:47.727073 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.727016 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/whereabouts-cni-bincopy/0.log" Apr 16 14:47:47.750665 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:47.750648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghk8n_98c854f8-33e5-46ea-aa35-7026190215b7/whereabouts-cni/0.log" Apr 16 14:47:48.123988 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:48.123969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x62b2_42f7372f-60f8-484f-bdc7-063aea09785d/kube-multus/0.log" Apr 16 14:47:48.255153 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:48.255131 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7bhkl_59442fa9-d5a4-452c-bf14-f93d58af99dc/network-metrics-daemon/0.log" Apr 16 14:47:48.271002 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:48.270982 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7bhkl_59442fa9-d5a4-452c-bf14-f93d58af99dc/kube-rbac-proxy/0.log" Apr 16 14:47:49.001865 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.001840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/ovn-controller/0.log" Apr 16 14:47:49.063811 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.063789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/ovn-acl-logging/0.log" Apr 16 14:47:49.084601 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.084580 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/kube-rbac-proxy-node/0.log" Apr 16 14:47:49.104764 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.104743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:47:49.123445 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.123430 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/northd/0.log" Apr 16 14:47:49.143270 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.143240 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/nbdb/0.log" Apr 16 14:47:49.164921 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.164862 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/sbdb/0.log" Apr 16 14:47:49.338792 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:49.338767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2vdts_bdeff3e6-46e4-45e1-a8f2-7934598cbfbd/ovnkube-controller/0.log" Apr 16 14:47:50.860635 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:50.860604 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bh7db_0020e0fe-4923-4ecf-86ba-90de98fb3649/network-check-target-container/0.log" Apr 16 14:47:51.796212 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:51.796177 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kx5cs_8e6892ba-2fcc-4246-87a3-cb11034c5167/iptables-alerter/0.log" Apr 16 14:47:52.468378 ip-10-0-128-129 kubenswrapper[2572]: I0416 14:47:52.468325 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vmp2w_ce2074e5-ffeb-4776-a271-517ad48e47e1/tuned/0.log"