Apr 16 18:09:34.970035 ip-10-0-142-228 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:09:35.363337 ip-10-0-142-228 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:35.363337 ip-10-0-142-228 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:09:35.363337 ip-10-0-142-228 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:35.363337 ip-10-0-142-228 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:09:35.363337 ip-10-0-142-228 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:35.366370 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.366275 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:09:35.370526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370502 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:35.370526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370523 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:35.370526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370527 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:35.370526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370530 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:35.370526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370533 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370537 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370540 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370543 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370546 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370549 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370552 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370554 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370557 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370560 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370562 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370565 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370567 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370570 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370572 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370575 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370577 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370580 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370590 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370592 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:35.370749 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370595 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370597 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370600 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370603 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370605 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370608 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370611 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370614 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370616 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370619 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370621 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370624 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370627 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370629 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370632 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370635 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370638 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370640 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370643 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370645 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:35.371229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370648 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370651 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370654 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370657 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370659 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370662 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370665 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370669 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370674 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370677 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370680 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370698 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370703 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370707 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370710 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370713 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370716 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370718 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370722 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:35.371767 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370725 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370727 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370730 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370733 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370736 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370740 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370742 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370745 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370748 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370750 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370753 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370756 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370758 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370761 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370764 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370774 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370776 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370779 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370781 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370784 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:35.372229 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370787 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:35.372727 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370789 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:35.372727 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.370793 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
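The deprecation warnings at the top of this log all point at the kubelet config file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump below). As a minimal sketch only, assuming the stock kubelet.config.k8s.io/v1beta1 schema and copying values from this log's FLAG dump, the deprecated flags could be expressed in that file roughly like this (the evictionHard value is purely hypothetical, shown only because the --minimum-container-ttl-duration warning suggests eviction thresholds as the replacement):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value as dumped below)
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# illustrative stand-in for --minimum-container-ttl-duration; value is
# an assumption, not taken from this log
evictionHard:
  memory.available: 100Mi

On OpenShift nodes this file is typically machine-managed, so the sketch above is for reading the log, not a recommended manual edit.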
Apr 16 18:09:35.372830 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372817 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:35.372830 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372829 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372832 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372836 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372839 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372842 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372845 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372848 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372851 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372853 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372856 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372859 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372862 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372865 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372868 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372870 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372873 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372875 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372878 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372882 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372885 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:35.372882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372888 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372891 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372894 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372898 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372900 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372903 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372905 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372908 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372910 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372913 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372915 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372918 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372921 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372923 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372926 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372928 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372931 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372933 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372936 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372938 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:35.373362 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372941 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372943 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372945 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372948 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372951 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372954 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372956 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372959 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372961 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372964 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372966 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372969 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372972 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372974 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372978 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372981 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372984 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372987 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372990 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372992 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:35.373911 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372995 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.372997 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373000 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373003 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373006 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373008 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373011 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373013 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373016 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373018 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373021 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373025 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373027 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373030 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373033 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373035 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373038 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373041 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373043 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:35.374432 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373046 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373050 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373053 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373056 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373060 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373062 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373134 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373142 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373150 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373154 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373160 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373163 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373168 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373173 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373176 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373179 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373183 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373186 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373189 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373193 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373197 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373202 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373207 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:09:35.374915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373211 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373215 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373244 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373248 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373252 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373255 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373259 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373264 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373267 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373270 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373274 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373277 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373280 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373283 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373286 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373290 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373295 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373298 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373301 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373304 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373308 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373311 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373317 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373320 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373323 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:09:35.375480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373326 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373329 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373333 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373336 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373339 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373342 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373345 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373348 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373351 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373354 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373357 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373360 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373363 2571 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373367 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373370 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373373 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373376 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373380 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373383 2571 flags.go:64] FLAG: --help="false"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373386 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-142-228.ec2.internal"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373389 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373392 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373396 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373399 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:09:35.376100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373403 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373406 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373408 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373412 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373416 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373419 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373422 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373425 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373428 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373431 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373434 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373437 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373440 2571 flags.go:64] FLAG: --lock-file=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373442 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373445 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373448 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373454 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373457 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373459 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373462 2571 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373465 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373468 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373471 2571 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373474 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373478 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:09:35.376706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373481 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373486 2571 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373489 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373499 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373504 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373508 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373511 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373514 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373517 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373525 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373528 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373531 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373536 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373539 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373544 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373547 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373550 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373553 2571 flags.go:64] FLAG: --port="10250"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373556 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373559 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c0d5e1e13bbae846"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373563 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373566 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373574 2571 flags.go:64] FLAG: --register-node="true"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373578 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:09:35.377323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373580 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373584 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373587 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373590 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373593 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373597 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373600 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373603 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373606 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373609 2571 flags.go:64] FLAG: --runonce="false"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373612 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373616 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373618 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373621 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373624 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373627 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373630 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373634 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373636 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373639 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373642 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373646 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373649 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373652 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373655 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:09:35.377936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373660 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373663 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373665 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373669 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373672 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373676 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373679 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373682 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373699 2571 flags.go:64] FLAG: --v="2"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373706 2571 flags.go:64] FLAG: --version="false"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373711 2571 flags.go:64] FLAG: --vmodule=""
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373720 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.373724 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373818 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373822 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373826 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373830 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373832 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373835 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373838 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373840 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373843 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373846 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:35.378530 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373848 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373850 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373853 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373856 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373858 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373861 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373868 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373870 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373873 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373875 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373878 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373881 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373883 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373886 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373890 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373892 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373895 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373897 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373900 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373903 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:35.379119 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373905 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373908 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373910 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373913 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373916 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373920 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373923 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373925 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373928 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373931 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373933 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373937 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373939 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373943 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373946 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373949 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373952 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373955 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373957 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:35.380049 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373960 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373963 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373965 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373968 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373970 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373973 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373976 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373979 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373981 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373985 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373988 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373991 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373994 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.373997 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374000 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374002 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374005 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374008 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374011 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374014 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:35.380890 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374016 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374019 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374022 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374024 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374027 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374029 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374032 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374034 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374037 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374039 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374042 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374044 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 
18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374047 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374049 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374052 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374054 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:35.381856 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.374057 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.374065 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.381277 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.381302 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.381377 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.381386 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.381391 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.381396 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:35.382626 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:35.381403 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
[a second parse of the feature-gate configuration then repeats the full set of "unrecognized feature gate" warnings and the KMSv1/ServiceAccountTokenNodeBinding notices, additionally warning for: Example, Example2, HighlyAvailableArbiter, ExternalOIDC, ShortCertRotation, NewOLMPreflightPermissionChecks, InsightsOnDemandDataGather, CPMSMachineNamePrefix, UpgradeStatus, NutanixMultiSubnets, MachineAPIMigration, DualReplica, AdminNetworkPolicy, OVNObservability, NetworkDiagnosticsConfig, AWSServiceLBNetworkSecurityGroup, ClusterAPIInstall, AWSDedicatedHosts, VSphereMultiNetworks; it ends with an identical "feature gates: {map[...]}" line at I0416 18:09:35.381802]
[a third parse of the feature-gate configuration repeats the same set of warnings and notices once more, ending with the identical resulting gate map:]
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.382354 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
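The warnings above come from the kubelet parsing a feature-gate map that mixes in OpenShift cluster-level gates the kubelet binary itself does not implement; unknown names are logged and skipped rather than treated as fatal. A minimal self-contained sketch of that tolerant parsing pattern (illustrative only; the known-gate set and the parseGates helper below are invented for the example and are not the kubelet's real API, which lives in k8s.io/component-base/featuregate):

```go
// Illustrative sketch, not kubelet code: parse a feature-gate map,
// warn on unknown gates, and keep going.
package main

import "log"

// Gates this hypothetical binary was compiled with.
var known = map[string]bool{
	"ImageVolume": true,
	"NodeSwap":    true,
}

func parseGates(cfg map[string]bool) map[string]bool {
	enabled := map[string]bool{}
	for name, val := range cfg {
		if !known[name] {
			// Skipped, not fatal: another component may own this gate.
			log.Printf("W feature_gate: unrecognized feature gate: %s", name)
			continue
		}
		enabled[name] = val
	}
	return enabled
}

func main() {
	// "GatewayAPI" is unknown to this binary, so it is warned about and dropped.
	log.Printf("feature gates: %v", parseGates(map[string]bool{
		"ImageVolume": true, "GatewayAPI": true, "NodeSwap": false,
	}))
}
```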
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.383133 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.385829 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.386716 2571 server.go:1019] "Starting client certificate rotation"
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.386814 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:35.387997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.386854 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:35.406504 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.406475 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:35.408507 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.408482 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:35.424091 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.424066 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:09:35.430400 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.430379 2571 log.go:25] "Validated CRI v1 image API"
Apr 16 18:09:35.431716 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.431699 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:09:35.435610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.435581 2571 fs.go:135] Filesystem UUIDs: map[23549cb2-dce2-4f05-91e1-5f2a4059cfba:/dev/nvme0n1p3 6addc2ca-6b83-4d5f-9858-3930e02fae88:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:09:35.435716 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.435607 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:09:35.437930 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.437908 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:35.441255 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.441138 2571 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:35.439490583 +0000 UTC m=+0.364041229 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105935 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24f83522cb9c72eacbbf9975a94fe9 SystemUUID:ec24f835-22cb-9c72-eacb-bf9975a94fe9 BootID:6e95a25f-8f22-49ed-b5f3-633f0bdf7097 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fc:b3:1a:4b:97 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fc:b3:1a:4b:97 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:c2:c8:3f:d0:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:09:35.441255 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.441244 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:09:35.441414 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.441357 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:09:35.442388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.442361 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:09:35.442569 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.442389 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-228.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:09:35.442655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.442584 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:09:35.442655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.442598 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:09:35.442655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.442616 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:09:35.444048 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.444034 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:09:35.445713 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.445700 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:09:35.446022 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.446010 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:09:35.447941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.447929 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:09:35.448006 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.447954 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
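The HardEvictionThresholds in the nodeConfig above encode expressions of the form signal<quantity (for example memory.available<100Mi) or signal<percentage (nodefs.available<10%). A rough sketch of how such an expression splits into signal and value; parseThreshold is a hypothetical helper, not the kubelet's actual parser, which lives in k8s.io/kubernetes/pkg/kubelet/eviction:

```go
// Illustrative sketch: split "memory.available<100Mi" into signal and value.
package main

import (
	"fmt"
	"strings"
)

type threshold struct {
	Signal  string // e.g. "memory.available", "nodefs.available"
	Value   string // an absolute quantity ("100Mi") or a percentage ("10%")
	Percent bool
}

func parseThreshold(expr string) (threshold, error) {
	// Hard eviction thresholds use the LessThan operator, written "<".
	sig, val, ok := strings.Cut(expr, "<")
	if !ok {
		return threshold{}, fmt.Errorf("missing operator in %q", expr)
	}
	return threshold{Signal: sig, Value: val, Percent: strings.HasSuffix(val, "%")}, nil
}

func main() {
	for _, e := range []string{"memory.available<100Mi", "nodefs.available<10%"} {
		t, err := parseThreshold(e)
		fmt.Println(t, err)
	}
}
```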
path" path="/etc/kubernetes/manifests" Apr 16 18:09:35.448006 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.447974 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:09:35.448006 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.447988 2571 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:09:35.448006 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.448001 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:09:35.449025 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.449012 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:35.449086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.449040 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:35.452112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.452087 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:09:35.453973 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.453960 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:09:35.455371 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455349 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:09:35.455439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455382 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:09:35.455439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455394 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:09:35.455439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455409 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:09:35.455439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455433 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455446 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455460 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455472 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455484 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455502 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455519 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:09:35.455564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.455536 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:09:35.457246 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.457235 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:09:35.457288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.457247 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 18:09:35.460715 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.460681 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-228.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:09:35.461107 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.461094 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:09:35.461147 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.461129 2571 server.go:1295] "Started kubelet" Apr 16 18:09:35.461259 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.461231 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:09:35.461372 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.461353 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:09:35.461419 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.461355 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-228.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:09:35.461477 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.461405 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:09:35.461477 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.461460 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:09:35.462023 ip-10-0-142-228 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:09:35.464898 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.464639 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:09:35.466172 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.466154 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:09:35.468877 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.468127 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-228.ec2.internal.18a6e8b6f083e211 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-228.ec2.internal,UID:ip-10-0-142-228.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-228.ec2.internal,},FirstTimestamp:2026-04-16 18:09:35.461106193 +0000 UTC m=+0.385656834,LastTimestamp:2026-04-16 18:09:35.461106193 +0000 UTC m=+0.385656834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-228.ec2.internal,}" Apr 16 18:09:35.470526 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.470509 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:09:35.471934 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.471918 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:35.472484 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.472466 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:09:35.473091 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.473070 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:35.473208 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473192 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:09:35.473208 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473191 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:09:35.473352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473219 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:09:35.473352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473301 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:09:35.473352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473311 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:09:35.473475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473411 2571 factory.go:153] Registering CRI-O factory Apr 16 18:09:35.473475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473436 2571 factory.go:223] Registration of the crio container factory successfully Apr 16 18:09:35.473573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473501 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:09:35.473573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473514 2571 factory.go:55] Registering systemd factory Apr 16 18:09:35.473573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473521 2571 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:09:35.473573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473542 2571 factory.go:103] Registering Raw factory Apr 16 18:09:35.473573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473557 2571 manager.go:1196] Started watching for new ooms in manager Apr 16 18:09:35.473949 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.473935 2571 manager.go:319] Starting recovery of all containers Apr 16 18:09:35.475729 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.475700 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:09:35.475845 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.475821 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-228.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:09:35.483254 
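The system:anonymous forbidden errors above are expected at this point: client certificate rotation is still bootstrapping, so the kubelet's watches run without credentials until its CSR is approved. A small sketch of treating Forbidden API errors as retryable during that window; fetchNode is a hypothetical stand-in for any client-go call, and apierrors.IsForbidden is the standard k8s.io/apimachinery helper:

```go
// Sketch: during TLS bootstrap, Forbidden errors are transient (credentials
// are not issued yet), so a client can wait and retry instead of failing.
package main

import (
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
)

func retryThroughBootstrap(fetchNode func() error) error {
	for {
		err := fetchNode()
		if apierrors.IsForbidden(err) { // e.g. `User "system:anonymous" cannot list ...`
			time.Sleep(time.Second) // wait for the client CSR to be approved and issued
			continue
		}
		return err // nil on success, or a non-RBAC failure worth surfacing
	}
}

func main() {
	_ = retryThroughBootstrap(func() error { return nil })
}
```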
Apr 16 18:09:35.483254 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.483110 2571 manager.go:324] Recovery completed
Apr 16 18:09:35.487483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.487428 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c54tm"
Apr 16 18:09:35.488462 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.488450 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:35.491035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491019 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:35.491107 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491050 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:35.491107 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491060 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:35.491531 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491516 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:09:35.491577 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491532 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:09:35.491577 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.491550 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:09:35.492971 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.492898 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-228.ec2.internal.18a6e8b6f24c8b30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-228.ec2.internal,UID:ip-10-0-142-228.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-228.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-228.ec2.internal,},FirstTimestamp:2026-04-16 18:09:35.491033904 +0000 UTC m=+0.415584546,LastTimestamp:2026-04-16 18:09:35.491033904 +0000 UTC m=+0.415584546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-228.ec2.internal,}"
Apr 16 18:09:35.494251 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.494235 2571 policy_none.go:49] "None policy: Start"
Apr 16 18:09:35.494251 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.494254 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:09:35.494344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.494264 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:09:35.496608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.496583 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c54tm"
Apr 16 18:09:35.532293 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532275 2571 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.532316 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
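The csr.go lines above show the kubelet's client CSR (csr-c54tm) being approved and then issued. A sketch of how the same state can be checked from client-go, assuming an already-configured clientset cs; csrIssued is a hypothetical helper:

```go
// Sketch: check whether a kubelet client CSR has been approved and issued,
// using standard client-go APIs. Clientset construction is elided.
package main

import (
	"context"
	"fmt"

	certificatesv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

func csrIssued(ctx context.Context, cs kubernetes.Interface, name string) (bool, error) {
	csr, err := cs.CertificatesV1().CertificateSigningRequests().Get(ctx, name, metav1.GetOptions{})
	if err != nil {
		return false, err
	}
	approved := false
	for _, c := range csr.Status.Conditions {
		if c.Type == certificatesv1.CertificateApproved {
			approved = true // "Certificate signing request is approved"
		}
	}
	// "Issued" corresponds to the signed certificate landing in status.certificate.
	return approved && len(csr.Status.Certificate) > 0, nil
}

func main() { fmt.Println("see csrIssued") }
```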
checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532327 2571 server.go:85] "Starting device plugin registration server" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532588 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532601 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532745 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532834 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.532842 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.533452 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:09:35.544144 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.533491 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:35.632896 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.632832 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:35.633786 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.633761 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:35.633892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.633796 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:35.633892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.633809 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:35.633892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.633835 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.634567 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.634547 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:09:35.635806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.635786 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:09:35.635898 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.635817 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:09:35.635898 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.635835 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:09:35.635898 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.635843 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:09:35.635898 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.635878 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:09:35.639618 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.639597 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:35.640642 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.640628 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-228.ec2.internal"
Apr 16 18:09:35.640733 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.640649 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-228.ec2.internal\": node \"ip-10-0-142-228.ec2.internal\" not found"
Apr 16 18:09:35.660610 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.660586 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found"
Apr 16 18:09:35.736565 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.736508 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal"]
Apr 16 18:09:35.736766 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.736615 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:09:35.738263 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.738246 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:09:35.738326 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.738277 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:09:35.738326 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.738289 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:09:35.739510 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.739499 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
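"Caches populated" is a client-go reflector message: a shared informer has finished its initial list and is now watching (the RuntimeClass entry above comes from informers/factory.go:160). A minimal sketch of the same machinery for Nodes, assuming the clientset from the earlier sketches:

import (
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
)

// waitForNodeCache starts a shared informer factory and blocks until the Node
// cache has synced, which is the point the "Caches populated" entries mark.
func waitForNodeCache(cs kubernetes.Interface, stop <-chan struct{}) cache.SharedIndexInformer {
	factory := informers.NewSharedInformerFactory(cs, 0) // 0 = no periodic resync
	nodes := factory.Core().V1().Nodes().Informer()
	factory.Start(stop)
	cache.WaitForCacheSync(stop, nodes.HasSynced)
	return nodes
}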
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.739724 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.739705 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:35.740285 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740269 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:35.740359 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740299 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:35.740359 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740310 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:35.740359 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740270 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:35.740448 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740374 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:35.740448 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.740388 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:35.742206 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.742190 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.742270 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.742224 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:35.743069 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.743055 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:35.743150 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.743083 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:35.743150 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.743093 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:35.758230 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.758209 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-228.ec2.internal\" not found" node="ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.761366 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.761352 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:35.761451 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.761438 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-228.ec2.internal\" not found" node="ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.862129 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.862094 2571 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:35.874419 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.874394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.874513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.874423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.874513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.874442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f66de67f3831643d984f3539ec96bac5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-228.ec2.internal\" (UID: \"f66de67f3831643d984f3539ec96bac5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.962823 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:35.962794 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:35.975186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f66de67f3831643d984f3539ec96bac5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-228.ec2.internal\" (UID: \"f66de67f3831643d984f3539ec96bac5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.975280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.975280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.975280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f66de67f3831643d984f3539ec96bac5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-228.ec2.internal\" (UID: \"f66de67f3831643d984f3539ec96bac5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.975280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975277 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:35.975434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:35.975273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/336c8c438166eaa9677e7c52ae568520-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal\" (UID: \"336c8c438166eaa9677e7c52ae568520\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:36.062369 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.062327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:36.063407 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.063388 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:36.065079 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.065060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:36.164074 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.164037 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:36.264661 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.264589 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:36.365096 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.365061 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:36.386576 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.386553 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:09:36.386732 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.386701 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:09:36.431480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.431458 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:36.448718 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.448680 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:36.465318 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.465289 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found" Apr 16 18:09:36.472994 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.472975 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:36.483931 ip-10-0-142-228 
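The VerifyControllerAttachedVolume / MountVolume.SetUp pairs above are the volume manager working through the static pods' hostPath volumes, which need no controller attach step. For reference, the "etc-kube" volume being mounted has this shape in the k8s.io/api types; the host path itself is an illustrative assumption, since the pod manifest is not part of this log:

import corev1 "k8s.io/api/core/v1"

// Sketch of a hostPath volume like the ones named in the log entries above.
var etcKube = corev1.Volume{
	Name: "etc-kube", // volume name taken from the log
	VolumeSource: corev1.VolumeSource{
		HostPath: &corev1.HostPathVolumeSource{
			Path: "/etc/kubernetes", // assumed path, not shown in the log
		},
	},
}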
Apr 16 18:09:36.483931 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.483904 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:36.498781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.498730 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:35 +0000 UTC" deadline="2027-12-28 06:59:06.79187691 +0000 UTC"
Apr 16 18:09:36.498781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.498780 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14892h49m30.293102031s"
Apr 16 18:09:36.519076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.519007 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dpn5n"
Apr 16 18:09:36.528557 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.528523 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dpn5n"
Apr 16 18:09:36.556933 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:36.556904 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf66de67f3831643d984f3539ec96bac5.slice/crio-a719a311442fce07661efb5a5d51ec0c4946a2d5709b328039ba310affb36105 WatchSource:0}: Error finding container a719a311442fce07661efb5a5d51ec0c4946a2d5709b328039ba310affb36105: Status 404 returned error can't find the container with id a719a311442fce07661efb5a5d51ec0c4946a2d5709b328039ba310affb36105
Apr 16 18:09:36.560622 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.560607 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:09:36.566261 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:36.566243 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-228.ec2.internal\" not found"
Apr 16 18:09:36.583015 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.582996 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:36.583116 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:36.583095 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod336c8c438166eaa9677e7c52ae568520.slice/crio-b4c2d68af059f291a65dd523a0f00a0679d85a95e6c752b7932adadb09682784 WatchSource:0}: Error finding container b4c2d68af059f291a65dd523a0f00a0679d85a95e6c752b7932adadb09682784: Status 404 returned error can't find the container with id b4c2d68af059f291a65dd523a0f00a0679d85a95e6c752b7932adadb09682784
Apr 16 18:09:36.638396 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.638343 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" event={"ID":"336c8c438166eaa9677e7c52ae568520","Type":"ContainerStarted","Data":"b4c2d68af059f291a65dd523a0f00a0679d85a95e6c752b7932adadb09682784"}
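"Certificate rotation deadline determined" is client-go's certificate manager scheduling the next rotation at a jittered point late in the certificate's validity; here the deadline (2027-12-28) falls roughly 85% of the way to the 2028-04-15 expiry, and the kubelet then sleeps until it. A hedged sketch of that idea in Go; client-go draws the fraction from a jittered range, and the exact 70-90% window used below should be treated as an assumption:

import (
	"math/rand"
	"time"
)

// rotationDeadline picks a jittered deadline in the last part of a
// certificate's lifetime, in the spirit of the log entry above.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	frac := 0.7 + 0.2*rand.Float64() // assumed 70-90% of the validity window
	return notBefore.Add(time.Duration(frac * float64(total)))
}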
event={"ID":"f66de67f3831643d984f3539ec96bac5","Type":"ContainerStarted","Data":"a719a311442fce07661efb5a5d51ec0c4946a2d5709b328039ba310affb36105"} Apr 16 18:09:36.673354 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.673327 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" Apr 16 18:09:36.697733 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.697278 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:09:36.699015 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.698990 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" Apr 16 18:09:36.708198 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:36.708172 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:09:37.449657 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.449622 2571 apiserver.go:52] "Watching apiserver" Apr 16 18:09:37.455415 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.455386 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:09:37.455799 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.455776 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-54j8c","openshift-multus/multus-tvhn4","openshift-network-diagnostics/network-check-target-n5jhc","openshift-ovn-kubernetes/ovnkube-node-mj9tk","kube-system/konnectivity-agent-8jkg6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq","openshift-cluster-node-tuning-operator/tuned-lrhq4","openshift-multus/network-metrics-daemon-lnrzm","openshift-network-operator/iptables-alerter-ncmtf","kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal","openshift-dns/node-resolver-7ldnx","openshift-image-registry/node-ca-4crql","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal"] Apr 16 18:09:37.460434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.460407 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.460434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.460430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.462188 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.462164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:37.462290 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.462256 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:37.464093 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464069 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464310 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464578 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-svd72\"" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464722 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464758 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464800 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.464825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.465217 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.464921 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.465217 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.465174 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-srkb8\"" Apr 16 18:09:37.466122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466101 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.466529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbf56\"" Apr 16 18:09:37.466529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466351 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:09:37.466529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466441 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:09:37.466777 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466751 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:09:37.466856 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466751 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.466856 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466823 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.467019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.466998 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:09:37.468417 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.468208 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.468484 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.468455 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.468484 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.468452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:09:37.468585 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.468571 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kt97x\"" Apr 16 18:09:37.470172 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.470150 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.470284 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.470263 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:37.470368 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.470345 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:37.473271 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.472485 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:09:37.473271 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.473141 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dlljj\"" Apr 16 18:09:37.473526 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.473501 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.474052 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.474030 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:09:37.475556 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.475534 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.475645 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.475575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.475903 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.475888 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jphs5\"" Apr 16 18:09:37.476028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.476008 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.476342 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.476327 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:09:37.477969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.477953 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.478067 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.477995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.478067 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.478025 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bqjb\"" Apr 16 18:09:37.478568 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.478229 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4crql" Apr 16 18:09:37.480214 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.480195 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mg4vw\"" Apr 16 18:09:37.480297 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.480260 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:09:37.480374 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.480304 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:09:37.480447 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.480381 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:09:37.480447 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.480400 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:37.482388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.482370 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:09:37.482494 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.482428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:09:37.482494 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.482437 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-95cd9\"" Apr 16 18:09:37.483784 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-host-slash\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.483899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-cnibin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.483899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-conf-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.483899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/382c7696-64ec-4dbb-9432-e6ac1f3479d8-tmp-dir\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.483899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-config\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.483899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-iptables-alerter-script\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-lib-modules\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483950 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-bin\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.483978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484006 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2szf\" (UniqueName: \"kubernetes.io/projected/d7db8d32-9b9c-46c1-b746-8adefec111d6-kube-api-access-m2szf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-system-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-systemd-units\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-os-release\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-binary-copy\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/382c7696-64ec-4dbb-9432-e6ac1f3479d8-hosts-file\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-registration-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.484164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-sys-fs\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-run\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-socket-dir-parent\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-multus\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-log-socket\") pod \"ovnkube-node-mj9tk\" (UID: 
\"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-cni-binary-copy\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-tmp\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-node-log\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-var-lib-kubelet\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-tuned\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-slash\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484457 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-device-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-kubelet\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-var-lib-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484526 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-ovn\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.484584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovn-node-metrics-cert\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vprlq\" (UniqueName: \"kubernetes.io/projected/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-kube-api-access-vprlq\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484601 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-sys\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-host\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-systemd\") 
pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-env-overrides\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6xn5\" (UniqueName: \"kubernetes.io/projected/df565fbf-1e31-4d50-9c3f-fbc370ba976a-kube-api-access-b6xn5\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-modprobe-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-conf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-hostroot\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " 
pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-etc-kubernetes\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484904 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-kubelet\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:37.485384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysconfig\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.484989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-systemd\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-multus-daemon-config\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cpgq\" (UniqueName: \"kubernetes.io/projected/9d27531f-08c4-4c67-974c-31cacc77b8be-kube-api-access-9cpgq\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-etc-selinux\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485086 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-netns\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-netns\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cnibin\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4xz\" (UniqueName: \"kubernetes.io/projected/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kube-api-access-8z4xz\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-kubernetes\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485213 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bd4j\" (UniqueName: \"kubernetes.io/projected/c627e69d-5828-401b-ad05-a17a07d351bf-kube-api-access-5bd4j\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-etc-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-netd\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-script-lib\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrt7\" (UniqueName: \"kubernetes.io/projected/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-kube-api-access-5xrt7\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-k8s-cni-cncf-io\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485459 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-multus-certs\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-system-cni-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: 
\"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-socket-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-os-release\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485609 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-bin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.486797 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.485640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6bx\" (UniqueName: \"kubernetes.io/projected/382c7696-64ec-4dbb-9432-e6ac1f3479d8-kube-api-access-wd6bx\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.529503 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.529468 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:36 +0000 UTC" deadline="2027-12-09 03:22:02.293517173 +0000 UTC" Apr 16 18:09:37.529503 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.529503 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14433h12m24.764017492s" Apr 16 18:09:37.574803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.574774 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:09:37.588651 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-netd\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.588651 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-script-lib\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.588893 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrt7\" (UniqueName: 
\"kubernetes.io/projected/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-kube-api-access-5xrt7\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.588893 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-k8s-cni-cncf-io\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.588893 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-multus-certs\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.588893 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-system-cni-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-socket-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.588975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-os-release\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-bin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-system-cni-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:09:37.589032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6bx\" (UniqueName: \"kubernetes.io/projected/382c7696-64ec-4dbb-9432-e6ac1f3479d8-kube-api-access-wd6bx\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.589076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-k8s-cni-cncf-io\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-host-slash\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-cnibin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-multus-certs\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-conf-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/382c7696-64ec-4dbb-9432-e6ac1f3479d8-tmp-dir\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589181 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-os-release\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589223 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-config\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-cnibin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-host-slash\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-conf-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-socket-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589344 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-bin\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.589408 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-iptables-alerter-script\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-lib-modules\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-bin\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:09:37.589511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2szf\" (UniqueName: \"kubernetes.io/projected/d7db8d32-9b9c-46c1-b746-8adefec111d6-kube-api-access-m2szf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-system-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-systemd-units\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-netd\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-os-release\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589626 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-lib-modules\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/382c7696-64ec-4dbb-9432-e6ac1f3479d8-tmp-dir\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-systemd-units\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-binary-copy\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-config\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovnkube-script-lib\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-os-release\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/382c7696-64ec-4dbb-9432-e6ac1f3479d8-hosts-file\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.590070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.589862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-system-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.590737 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.590019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-cni-bin\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.590737 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.590405 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-registration-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.591037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-registration-dir\") pod 
\"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.591184 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/382c7696-64ec-4dbb-9432-e6ac1f3479d8-hosts-file\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx" Apr 16 18:09:37.591594 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-sys-fs\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.591708 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.591708 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-binary-copy\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.591708 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591702 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-sys-fs\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-run\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-run\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-socket-dir-parent\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-multus\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-log-socket\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-iptables-alerter-script\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-cni-binary-copy\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-socket-dir-parent\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.591852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-cni-multus\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb99ad62-9922-4bfa-94da-001321cb977d-host\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-tmp\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591908 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-log-socket\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-node-log\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.591977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-var-lib-kubelet\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-tuned\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-slash\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-device-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz49w\" (UniqueName: \"kubernetes.io/projected/bb99ad62-9922-4bfa-94da-001321cb977d-kube-api-access-tz49w\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-var-lib-kubelet\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.592272 ip-10-0-142-228 
kubenswrapper[2571]: I0416 18:09:37.592099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9b2f58f-9716-4752-b28b-793007f4eb48-konnectivity-ca\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-kubelet\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592140 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-var-lib-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-ovn\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.592272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-node-log\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovn-node-metrics-cert\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9b2f58f-9716-4752-b28b-793007f4eb48-agent-certs\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vprlq\" (UniqueName: \"kubernetes.io/projected/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-kube-api-access-vprlq\") pod \"iptables-alerter-ncmtf\" (UID: 
\"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-slash\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-sys\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-host\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-cni-binary-copy\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-var-lib-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-systemd\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592364 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-device-dir\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-var-lib-kubelet\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-sys\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-ovn\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-env-overrides\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592444 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:09:37.593117 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-run-systemd\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xn5\" (UniqueName: \"kubernetes.io/projected/df565fbf-1e31-4d50-9c3f-fbc370ba976a-kube-api-access-b6xn5\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-modprobe-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-conf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-host\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-hostroot\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-etc-kubernetes\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-kubelet\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-modprobe-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysconfig\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-env-overrides\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592848 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-systemd\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-conf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-multus-daemon-config\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysconfig\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cpgq\" (UniqueName: \"kubernetes.io/projected/9d27531f-08c4-4c67-974c-31cacc77b8be-kube-api-access-9cpgq\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-etc-selinux\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.592973 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-multus-cni-dir\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.593043 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:09:38.093023844 +0000 UTC m=+3.017574494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-kubelet\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593123 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb99ad62-9922-4bfa-94da-001321cb977d-serviceca\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-systemd\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-netns\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-netns\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-host-run-netns\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cnibin\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:09:37.593237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-hostroot\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c627e69d-5828-401b-ad05-a17a07d351bf-multus-daemon-config\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.594926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.592747 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c627e69d-5828-401b-ad05-a17a07d351bf-etc-kubernetes\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4" Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4xz\" (UniqueName: \"kubernetes.io/projected/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kube-api-access-8z4xz\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-cnibin\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c" Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-kubernetes\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593318 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39a3c457-c5d2-4ba4-9e24-60e3f0195721-etc-selinux\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-kubernetes\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" 
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-netns\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bd4j\" (UniqueName: \"kubernetes.io/projected/c627e69d-5828-401b-ad05-a17a07d351bf-kube-api-access-5bd4j\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-etc-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-sysctl-d\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-etc-openvswitch\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.595755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.593859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df565fbf-1e31-4d50-9c3f-fbc370ba976a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c"
Apr 16 18:09:37.596452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.595959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-etc-tuned\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.596452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.596066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-ovn-node-metrics-cert\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.596452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.596134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7db8d32-9b9c-46c1-b746-8adefec111d6-tmp\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.598384 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.598412 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.598425 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:37.598499 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:38.098479939 +0000 UTC m=+3.023030591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.598623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrt7\" (UniqueName: \"kubernetes.io/projected/c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3-kube-api-access-5xrt7\") pod \"ovnkube-node-mj9tk\" (UID: \"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.598711 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.598647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6bx\" (UniqueName: \"kubernetes.io/projected/382c7696-64ec-4dbb-9432-e6ac1f3479d8-kube-api-access-wd6bx\") pod \"node-resolver-7ldnx\" (UID: \"382c7696-64ec-4dbb-9432-e6ac1f3479d8\") " pod="openshift-dns/node-resolver-7ldnx"
Apr 16 18:09:37.600081 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.600054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2szf\" (UniqueName: \"kubernetes.io/projected/d7db8d32-9b9c-46c1-b746-8adefec111d6-kube-api-access-m2szf\") pod \"tuned-lrhq4\" (UID: \"d7db8d32-9b9c-46c1-b746-8adefec111d6\") " pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.601229 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.601207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cpgq\" (UniqueName: \"kubernetes.io/projected/9d27531f-08c4-4c67-974c-31cacc77b8be-kube-api-access-9cpgq\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:37.601877 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.601847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xn5\" (UniqueName: \"kubernetes.io/projected/df565fbf-1e31-4d50-9c3f-fbc370ba976a-kube-api-access-b6xn5\") pod \"multus-additional-cni-plugins-54j8c\" (UID: \"df565fbf-1e31-4d50-9c3f-fbc370ba976a\") " pod="openshift-multus/multus-additional-cni-plugins-54j8c"
Apr 16 18:09:37.602067 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.602033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vprlq\" (UniqueName: \"kubernetes.io/projected/3d2ca872-f7ad-40cd-9877-7b3ba0974e87-kube-api-access-vprlq\") pod \"iptables-alerter-ncmtf\" (UID: \"3d2ca872-f7ad-40cd-9877-7b3ba0974e87\") " pod="openshift-network-operator/iptables-alerter-ncmtf"
Apr 16 18:09:37.606106 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.606068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bd4j\" (UniqueName: \"kubernetes.io/projected/c627e69d-5828-401b-ad05-a17a07d351bf-kube-api-access-5bd4j\") pod \"multus-tvhn4\" (UID: \"c627e69d-5828-401b-ad05-a17a07d351bf\") " pod="openshift-multus/multus-tvhn4"
Apr 16 18:09:37.606106 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.606102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4xz\" (UniqueName: \"kubernetes.io/projected/39a3c457-c5d2-4ba4-9e24-60e3f0195721-kube-api-access-8z4xz\") pod \"aws-ebs-csi-driver-node-2q8nq\" (UID: \"39a3c457-c5d2-4ba4-9e24-60e3f0195721\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq"
Apr 16 18:09:37.694086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb99ad62-9922-4bfa-94da-001321cb977d-host\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.694086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz49w\" (UniqueName: \"kubernetes.io/projected/bb99ad62-9922-4bfa-94da-001321cb977d-kube-api-access-tz49w\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.694325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9b2f58f-9716-4752-b28b-793007f4eb48-konnectivity-ca\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6"
Apr 16 18:09:37.694325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb99ad62-9922-4bfa-94da-001321cb977d-host\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.694325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9b2f58f-9716-4752-b28b-793007f4eb48-agent-certs\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6"
Apr 16 18:09:37.694325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb99ad62-9922-4bfa-94da-001321cb977d-serviceca\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.694628 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.694609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb99ad62-9922-4bfa-94da-001321cb977d-serviceca\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.695153 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.695132 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9b2f58f-9716-4752-b28b-793007f4eb48-konnectivity-ca\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6"
Apr 16 18:09:37.697137 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.697119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9b2f58f-9716-4752-b28b-793007f4eb48-agent-certs\") pod \"konnectivity-agent-8jkg6\" (UID: \"d9b2f58f-9716-4752-b28b-793007f4eb48\") " pod="kube-system/konnectivity-agent-8jkg6"
Apr 16 18:09:37.702442 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.702396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz49w\" (UniqueName: \"kubernetes.io/projected/bb99ad62-9922-4bfa-94da-001321cb977d-kube-api-access-tz49w\") pod \"node-ca-4crql\" (UID: \"bb99ad62-9922-4bfa-94da-001321cb977d\") " pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.771419 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.771378 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lrhq4"
Apr 16 18:09:37.780391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.780364 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tvhn4"
Apr 16 18:09:37.782844 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.782827 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:09:37.792466 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.792443 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ncmtf"
Apr 16 18:09:37.798294 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.798274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:09:37.804866 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.804844 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq"
Apr 16 18:09:37.810464 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.810447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-54j8c"
Apr 16 18:09:37.818018 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.817994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7ldnx"
Apr 16 18:09:37.825554 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.825533 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4crql"
Apr 16 18:09:37.831174 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:37.831156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8jkg6"
Apr 16 18:09:38.097168 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.097081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:38.097312 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.097223 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:38.097312 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.097299 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:09:39.097276388 +0000 UTC m=+4.021827018 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:38.197904 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.197878 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:38.198053 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.197986 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:38.198053 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.198000 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:38.198053 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.198009 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:38.198053 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:38.198052 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:39.198039776 +0000 UTC m=+4.122590405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:38.262858 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.262826 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc627e69d_5828_401b_ad05_a17a07d351bf.slice/crio-466972d75963d42c1355bd3eaa680f04655820ad74532ddfa48280faa04bc67c WatchSource:0}: Error finding container 466972d75963d42c1355bd3eaa680f04655820ad74532ddfa48280faa04bc67c: Status 404 returned error can't find the container with id 466972d75963d42c1355bd3eaa680f04655820ad74532ddfa48280faa04bc67c
Apr 16 18:09:38.264158 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.264063 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b82b47_ada7_4f38_9b8a_d7aa9bdb6be3.slice/crio-bc361a80ac00480fda6ab9fd65681406f00c7a1b14157c922cb6430942059a32 WatchSource:0}: Error finding container bc361a80ac00480fda6ab9fd65681406f00c7a1b14157c922cb6430942059a32: Status 404 returned error can't find the container with id bc361a80ac00480fda6ab9fd65681406f00c7a1b14157c922cb6430942059a32
Apr 16 18:09:38.265352 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.265302 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a3c457_c5d2_4ba4_9e24_60e3f0195721.slice/crio-fe79c7d72336a4aa9db31f8bdc351a0b61ab05391f8fb8111bb9226587f6fa1e WatchSource:0}: Error finding container fe79c7d72336a4aa9db31f8bdc351a0b61ab05391f8fb8111bb9226587f6fa1e: Status 404 returned error can't find the container with id fe79c7d72336a4aa9db31f8bdc351a0b61ab05391f8fb8111bb9226587f6fa1e
Apr 16 18:09:38.268059 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.267913 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf565fbf_1e31_4d50_9c3f_fbc370ba976a.slice/crio-f6cfd4626190a8ee9004be1ce22c8be59366a2d4c2576112a2f2faf9521e65d2 WatchSource:0}: Error finding container f6cfd4626190a8ee9004be1ce22c8be59366a2d4c2576112a2f2faf9521e65d2: Status 404 returned error can't find the container with id f6cfd4626190a8ee9004be1ce22c8be59366a2d4c2576112a2f2faf9521e65d2
Apr 16 18:09:38.269947 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.269915 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2ca872_f7ad_40cd_9877_7b3ba0974e87.slice/crio-c58b38dd2b8edf01aebeecee39456d3c1deabd31e13c1293092008bcc1d5c28c WatchSource:0}: Error finding container c58b38dd2b8edf01aebeecee39456d3c1deabd31e13c1293092008bcc1d5c28c: Status 404 returned error can't find the container with id c58b38dd2b8edf01aebeecee39456d3c1deabd31e13c1293092008bcc1d5c28c
Apr 16 18:09:38.271542 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:09:38.270644 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382c7696_64ec_4dbb_9432_e6ac1f3479d8.slice/crio-70f250523b066e59c12c80cf0e34c1e15275466d4c745b4a14f976ad67243f02 WatchSource:0}: Error finding container 70f250523b066e59c12c80cf0e34c1e15275466d4c745b4a14f976ad67243f02: Status 404 returned error can't find the container with id 70f250523b066e59c12c80cf0e34c1e15275466d4c745b4a14f976ad67243f02
Apr 16 18:09:38.530267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.530075 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:04:36 +0000 UTC" deadline="2028-02-01 11:06:47.593130772 +0000 UTC"
Apr 16 18:09:38.530267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.530260 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15736h57m9.062873279s"
Apr 16 18:09:38.644891 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.644810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" event={"ID":"f66de67f3831643d984f3539ec96bac5","Type":"ContainerStarted","Data":"0fd5b7e373c69d9a75edecfbb7feeaad71888dc428f641d4992c0d88c0044e52"}
Apr 16 18:09:38.648096 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.648022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4crql" event={"ID":"bb99ad62-9922-4bfa-94da-001321cb977d","Type":"ContainerStarted","Data":"d1b72890773fc542419e713085bff122ef6095b45913bec88d44fd05e3fc5164"}
Apr 16 18:09:38.662171 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.662098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerStarted","Data":"f6cfd4626190a8ee9004be1ce22c8be59366a2d4c2576112a2f2faf9521e65d2"}
Apr 16 18:09:38.667720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.667657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" event={"ID":"39a3c457-c5d2-4ba4-9e24-60e3f0195721","Type":"ContainerStarted","Data":"fe79c7d72336a4aa9db31f8bdc351a0b61ab05391f8fb8111bb9226587f6fa1e"}
Apr 16 18:09:38.669911 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.669860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" event={"ID":"d7db8d32-9b9c-46c1-b746-8adefec111d6","Type":"ContainerStarted","Data":"25f888e47e218d7cff23854828cfc2b14944cd4e1550f6d079133bcc8c7b75a2"}
Apr 16 18:09:38.672470 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.672425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8jkg6" event={"ID":"d9b2f58f-9716-4752-b28b-793007f4eb48","Type":"ContainerStarted","Data":"797105f7b6ac7eed3080ad2eac1a4851f005b1959436857fbe4cc9b2e1b34623"}
Apr 16 18:09:38.674861 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.674839 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7ldnx" event={"ID":"382c7696-64ec-4dbb-9432-e6ac1f3479d8","Type":"ContainerStarted","Data":"70f250523b066e59c12c80cf0e34c1e15275466d4c745b4a14f976ad67243f02"}
Apr 16 18:09:38.677904 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.677830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ncmtf" event={"ID":"3d2ca872-f7ad-40cd-9877-7b3ba0974e87","Type":"ContainerStarted","Data":"c58b38dd2b8edf01aebeecee39456d3c1deabd31e13c1293092008bcc1d5c28c"}
Apr 16 18:09:38.682525 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.682498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"bc361a80ac00480fda6ab9fd65681406f00c7a1b14157c922cb6430942059a32"}
Apr 16 18:09:38.688366 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:38.687904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvhn4" event={"ID":"c627e69d-5828-401b-ad05-a17a07d351bf","Type":"ContainerStarted","Data":"466972d75963d42c1355bd3eaa680f04655820ad74532ddfa48280faa04bc67c"}
Apr 16 18:09:39.105922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.105329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:39.105922 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.105486 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:39.105922 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.105561 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:09:41.105541525 +0000 UTC m=+6.030092160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:39.206263 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.206184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:39.206410 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.206368 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:39.206410 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.206393 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:39.206410 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.206407 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:39.206568 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.206467 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:41.2064473 +0000 UTC m=+6.130997930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:39.637388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.636901 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:39.637388 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.637024 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:39.638314 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.638152 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:39.638314 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:39.638273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:39.706606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.705968 2571 generic.go:358] "Generic (PLEG): container finished" podID="336c8c438166eaa9677e7c52ae568520" containerID="0a9be53589bc5d0b2cec17c0fe0c42403ccc708cf88f867c297b838bc31400ed" exitCode=0
Apr 16 18:09:39.706606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.706562 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" event={"ID":"336c8c438166eaa9677e7c52ae568520","Type":"ContainerDied","Data":"0a9be53589bc5d0b2cec17c0fe0c42403ccc708cf88f867c297b838bc31400ed"}
Apr 16 18:09:39.721940 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:39.721885 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-228.ec2.internal" podStartSLOduration=3.721865014 podStartE2EDuration="3.721865014s" podCreationTimestamp="2026-04-16 18:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:38.666400639 +0000 UTC m=+3.590951287" watchObservedRunningTime="2026-04-16 18:09:39.721865014 +0000 UTC m=+4.646415666"
Apr 16 18:09:40.731750 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:40.731015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" event={"ID":"336c8c438166eaa9677e7c52ae568520","Type":"ContainerStarted","Data":"52a7d4b398d76033f7c244b3af7b88e4a38ed7855eb0da2e30d68b144a547834"}
Apr 16 18:09:41.122671 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.122586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:41.122839 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.122750 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:41.122917 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.122849 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:09:45.122827951 +0000 UTC m=+10.047378579 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:41.127254 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.127207 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-228.ec2.internal" podStartSLOduration=5.127188158 podStartE2EDuration="5.127188158s" podCreationTimestamp="2026-04-16 18:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:09:40.74654356 +0000 UTC m=+5.671094212" watchObservedRunningTime="2026-04-16 18:09:41.127188158 +0000 UTC m=+6.051738811"
Apr 16 18:09:41.128244 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.128223 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ksk5b"]
Apr 16 18:09:41.131261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.131242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.131355 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.131321 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b"
Apr 16 18:09:41.223025 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.222989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-dbus\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.223211 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.223088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-kubelet-config\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.223211 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.223150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:41.223211 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.223181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.223368 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.223337 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:41.223368 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.223361 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:41.223470 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.223377 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:41.223470 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.223434 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:45.223415493 +0000 UTC m=+10.147966137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:41.323591 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.323549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-kubelet-config\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.323792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.323630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.323792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.323660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-dbus\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.323906 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.323860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-dbus\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.323957 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.323930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bbf3171b-ab57-4ca5-93df-e38037360c5b-kubelet-config\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.324051 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.324026 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:41.324185 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.324105 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:09:41.824086233 +0000 UTC m=+6.748636868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:41.636698 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.636650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:41.636864 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.636811 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:41.637282 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.637262 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:41.637375 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.637352 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:41.829215 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:41.829138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:41.829660 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.829277 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:41.829660 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:41.829347 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:09:42.829328977 +0000 UTC m=+7.753879623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:42.636764 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:42.636729 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:42.636934 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:42.636841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b"
Apr 16 18:09:42.836495 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:42.836458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:42.836975 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:42.836639 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:42.836975 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:42.836717 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:09:44.836698003 +0000 UTC m=+9.761248646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:43.636778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:43.636516 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:43.636778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:43.636542 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:43.636778 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:43.636637 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:43.636778 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:43.636750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:44.636933 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:44.636900 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:44.637397 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:44.637047 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b"
Apr 16 18:09:44.854083 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:44.854042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:44.854267 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:44.854198 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:44.854331 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:44.854273 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:09:48.854255636 +0000 UTC m=+13.778806280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:45.157843 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:45.157801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:45.158047 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.157960 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:45.158047 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.158030 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:09:53.158009603 +0000 UTC m=+18.082560238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:09:45.258978 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:45.258937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:45.259163 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.259136 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:09:45.259163 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.259162 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:09:45.259275 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.259175 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:45.259275 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.259237 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:53.25921778 +0000 UTC m=+18.183768431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:09:45.637400 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:45.636895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:45.637400 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.637003 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:45.638131 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:45.637973 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:45.638131 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:45.638100 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:46.636520 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:46.636482 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:46.636751 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:46.636612 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b"
Apr 16 18:09:47.636893 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:47.636806 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:47.637300 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:47.636929 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:47.637300 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:47.636958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:47.637300 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:47.637067 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:48.636301 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:48.636264 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:48.636567 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:48.636421 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b"
Apr 16 18:09:48.885975 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:48.885929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:48.886462 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:48.886046 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:48.886462 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:48.886122 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:09:56.886102149 +0000 UTC m=+21.810652792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:09:49.636888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:49.636853 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:09:49.637078 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:49.636853 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:09:49.637078 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:49.636971 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4"
Apr 16 18:09:49.637078 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:49.637037 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be"
Apr 16 18:09:50.636228 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:50.636197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:09:50.636597 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:50.636297 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:09:51.636999 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:51.636963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:51.637476 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:51.637014 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:51.637476 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:51.637092 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:51.637476 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:51.637192 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:52.636254 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:52.636216 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:09:52.636425 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:52.636332 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:09:53.222237 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:53.222192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:53.222710 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.222349 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:53.222710 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.222426 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.222400993 +0000 UTC m=+34.146951648 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:09:53.323352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:53.323313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:53.323539 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.323513 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:09:53.323601 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.323592 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:09:53.323650 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.323607 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:53.323718 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.323673 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.323652808 +0000 UTC m=+34.248203444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:09:53.636939 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:53.636515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:53.636939 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:53.636519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:53.636939 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.636754 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:53.636939 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:53.636672 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:54.637035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:54.636989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:09:54.637512 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:54.637136 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:09:55.638411 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.637636 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:55.638411 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.637733 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:55.638411 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:55.638187 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:55.638411 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:55.638232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:55.756402 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.756364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" event={"ID":"d7db8d32-9b9c-46c1-b746-8adefec111d6","Type":"ContainerStarted","Data":"e0c58d837e3f3fb024ea0accaafefe7930fd7d0a5caa84a369fb4621ce1bfe49"} Apr 16 18:09:55.757823 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.757787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8jkg6" event={"ID":"d9b2f58f-9716-4752-b28b-793007f4eb48","Type":"ContainerStarted","Data":"cce4f252986da1c634fe84ffe4afd26e0ea3e511e36df9166db1f30f7c4c51fc"} Apr 16 18:09:55.759162 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.759132 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7ldnx" event={"ID":"382c7696-64ec-4dbb-9432-e6ac1f3479d8","Type":"ContainerStarted","Data":"68a926f951f5b8caf5f1b8cbce1445aa5f0f62334aab7100b9a0e8472e479002"} Apr 16 18:09:55.762037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"c774651346a259d71d80c5ac80100e768106a992ebe016e57c68d974ed2c24fb"} Apr 16 18:09:55.762037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"2859d44344814ba8076389bd59018e7de4e9a479111fa71dbe752a37df80d58e"} Apr 16 18:09:55.762222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"197601e2969332b81d7372d4002e00038e7fdfef33c41abcf9be315602727b8f"} Apr 16 18:09:55.762222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762058 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"36b27fc85e088e8b3bcc97bf8d3e0ba017597c0e9165f68323bdb8319de9a914"} Apr 16 18:09:55.762222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"08d2b74a85fea1e0efd7263b73cd19371daa13a4f47d6701888040739f419d4f"} Apr 16 18:09:55.762222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.762074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"fa2d87e7fbb57b5975adc01f94d8463e0205f332d6868f452f81d952e4103848"} Apr 16 18:09:55.763265 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.763242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tvhn4" event={"ID":"c627e69d-5828-401b-ad05-a17a07d351bf","Type":"ContainerStarted","Data":"e18c15ad76a79cfb82f6bdce6e5a122fb88cb19d2c51f79e229a30b7b4bee070"} Apr 16 18:09:55.764531 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:09:55.764511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4crql" event={"ID":"bb99ad62-9922-4bfa-94da-001321cb977d","Type":"ContainerStarted","Data":"633ad4df58144ff01751e42f554d074587d5b8c1bb1b0f3b3662ffa3ad41bae9"} Apr 16 18:09:55.766362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.766140 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="e49809cfdfa1f32c2844ef61a3719ff1618105c84d4a62beb06ad8f421823ada" exitCode=0 Apr 16 18:09:55.766362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.766212 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"e49809cfdfa1f32c2844ef61a3719ff1618105c84d4a62beb06ad8f421823ada"} Apr 16 18:09:55.767599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.767571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" event={"ID":"39a3c457-c5d2-4ba4-9e24-60e3f0195721","Type":"ContainerStarted","Data":"d2ddfa51926849d540689430d5e58d67b94942fbe00ed296c924c7a03d482fa2"} Apr 16 18:09:55.772280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.772227 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lrhq4" podStartSLOduration=4.104803148 podStartE2EDuration="20.772214708s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.278036528 +0000 UTC m=+3.202587161" lastFinishedPulling="2026-04-16 18:09:54.945448078 +0000 UTC m=+19.869998721" observedRunningTime="2026-04-16 18:09:55.771723405 +0000 UTC m=+20.696274059" watchObservedRunningTime="2026-04-16 18:09:55.772214708 +0000 UTC m=+20.696765358" Apr 16 18:09:55.786277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.786228 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tvhn4" podStartSLOduration=4.068052819 podStartE2EDuration="20.786217967s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.264386302 +0000 UTC m=+3.188936935" lastFinishedPulling="2026-04-16 18:09:54.982551444 +0000 UTC m=+19.907102083" observedRunningTime="2026-04-16 18:09:55.785953639 +0000 UTC m=+20.710504332" watchObservedRunningTime="2026-04-16 18:09:55.786217967 +0000 UTC m=+20.710768616" Apr 16 18:09:55.799180 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.799133 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7ldnx" podStartSLOduration=4.13408135 podStartE2EDuration="20.799110801s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.275192947 +0000 UTC m=+3.199743579" lastFinishedPulling="2026-04-16 18:09:54.940222385 +0000 UTC m=+19.864773030" observedRunningTime="2026-04-16 18:09:55.798534281 +0000 UTC m=+20.723084945" watchObservedRunningTime="2026-04-16 18:09:55.799110801 +0000 UTC m=+20.723661453" Apr 16 18:09:55.854313 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.854256 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4crql" podStartSLOduration=4.191589388 podStartE2EDuration="20.854239874s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.27771924 +0000 UTC m=+3.202269869" 
lastFinishedPulling="2026-04-16 18:09:54.940369722 +0000 UTC m=+19.864920355" observedRunningTime="2026-04-16 18:09:55.829407521 +0000 UTC m=+20.753958173" watchObservedRunningTime="2026-04-16 18:09:55.854239874 +0000 UTC m=+20.778790503" Apr 16 18:09:55.876309 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:55.876256 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8jkg6" podStartSLOduration=3.211245025 podStartE2EDuration="19.876241206s" podCreationTimestamp="2026-04-16 18:09:36 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.275264101 +0000 UTC m=+3.199814734" lastFinishedPulling="2026-04-16 18:09:54.94026028 +0000 UTC m=+19.864810915" observedRunningTime="2026-04-16 18:09:55.875748753 +0000 UTC m=+20.800299402" watchObservedRunningTime="2026-04-16 18:09:55.876241206 +0000 UTC m=+20.800791856" Apr 16 18:09:56.395344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.395313 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:09:56.546055 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.545920 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:09:56.395336938Z","UUID":"72df648d-dfc4-4f3f-a257-20f41fb4e559","Handler":null,"Name":"","Endpoint":""} Apr 16 18:09:56.548899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.548876 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:09:56.548899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.548904 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:09:56.636650 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.636616 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:09:56.636824 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:56.636745 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:09:56.771428 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.771387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" event={"ID":"39a3c457-c5d2-4ba4-9e24-60e3f0195721","Type":"ContainerStarted","Data":"9798f4b371098503cee9760553ce4736162cae553e4aa432e720a89734c18695"} Apr 16 18:09:56.773013 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.772985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ncmtf" event={"ID":"3d2ca872-f7ad-40cd-9877-7b3ba0974e87","Type":"ContainerStarted","Data":"fb8fe25cf6761a52c900c40d88bfa6e0b4ac9c62c912b6ad416cf1ef31f5fd4d"} Apr 16 18:09:56.787858 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.787807 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ncmtf" podStartSLOduration=5.122800137 podStartE2EDuration="21.787790901s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.275282664 +0000 UTC m=+3.199833295" lastFinishedPulling="2026-04-16 18:09:54.940273426 +0000 UTC m=+19.864824059" observedRunningTime="2026-04-16 18:09:56.787005925 +0000 UTC m=+21.711556576" watchObservedRunningTime="2026-04-16 18:09:56.787790901 +0000 UTC m=+21.712341551" Apr 16 18:09:56.954576 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:56.954472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:09:56.954793 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:56.954591 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:56.954793 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:56.954704 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret podName:bbf3171b-ab57-4ca5-93df-e38037360c5b nodeName:}" failed. No retries permitted until 2026-04-16 18:10:12.954668236 +0000 UTC m=+37.879218865 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret") pod "global-pull-secret-syncer-ksk5b" (UID: "bbf3171b-ab57-4ca5-93df-e38037360c5b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:09:57.639189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:57.639165 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:57.639344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:57.639166 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:57.639344 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:57.639276 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:57.639428 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:57.639391 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:57.779154 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:57.779081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" event={"ID":"39a3c457-c5d2-4ba4-9e24-60e3f0195721","Type":"ContainerStarted","Data":"3a877328f1ebb888548b8b8af6334d23b03fe0703fb8afadef8de78c9f1462c5"} Apr 16 18:09:57.808658 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:57.808607 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2q8nq" podStartSLOduration=3.548059495 podStartE2EDuration="22.808588561s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.26732449 +0000 UTC m=+3.191875119" lastFinishedPulling="2026-04-16 18:09:57.527853556 +0000 UTC m=+22.452404185" observedRunningTime="2026-04-16 18:09:57.808156289 +0000 UTC m=+22.732706967" watchObservedRunningTime="2026-04-16 18:09:57.808588561 +0000 UTC m=+22.733139217" Apr 16 18:09:58.636639 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:58.636600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:09:58.636815 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:58.636750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:09:58.784625 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:58.784587 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"ef2e7dbf6e8824633854d7a187a567ae24d4c9deea50a2199c5a09a2164715c0"} Apr 16 18:09:59.637048 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.636816 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:59.637048 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.636838 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:09:59.637294 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.636838 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:09:59.637347 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:59.637196 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:09:59.637402 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:09:59.637367 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:09:59.642452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.642422 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:59.786433 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.786400 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:09:59.786996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:09:59.786976 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8jkg6" Apr 16 18:10:00.636585 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.636548 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:10:00.636781 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:00.636653 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:10:00.791532 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.791497 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" event={"ID":"c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3","Type":"ContainerStarted","Data":"c30ced2763ecccfdbb0148e47b926e6d2f41e2c86d3be414e4dda1c17a5c0113"} Apr 16 18:10:00.792400 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.791800 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:10:00.792400 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.791823 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:10:00.793170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.793143 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="1dffd4103b10214003be6439161212c904009f33b0a141d471babf03ee438c92" exitCode=0 Apr 16 18:10:00.793287 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.793229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"1dffd4103b10214003be6439161212c904009f33b0a141d471babf03ee438c92"} Apr 16 18:10:00.807303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.807286 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:10:00.820391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:00.820355 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" podStartSLOduration=8.878840951 podStartE2EDuration="25.820343829s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.266431079 +0000 UTC m=+3.190981722" lastFinishedPulling="2026-04-16 18:09:55.207933956 +0000 UTC m=+20.132484600" observedRunningTime="2026-04-16 18:10:00.820120678 +0000 UTC m=+25.744671330" watchObservedRunningTime="2026-04-16 18:10:00.820343829 +0000 UTC m=+25.744894523" Apr 16 18:10:01.640371 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.640199 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:01.640501 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:01.640462 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:10:01.640501 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.640199 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:10:01.640610 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:01.640595 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:10:01.796549 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.796457 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="cbe1572dbd1088751d207bdbafecf18906b92166fe7636f4b1a7815ac9bdb49b" exitCode=0 Apr 16 18:10:01.796938 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.796540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"cbe1572dbd1088751d207bdbafecf18906b92166fe7636f4b1a7815ac9bdb49b"} Apr 16 18:10:01.797350 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.797269 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:10:01.811674 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.811647 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk" Apr 16 18:10:01.885742 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.885704 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ksk5b"] Apr 16 18:10:01.885883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.885860 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:10:01.885968 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:01.885951 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:10:01.888822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.888790 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lnrzm"] Apr 16 18:10:01.888969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.888891 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:10:01.889015 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:01.888996 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:10:01.891473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.891449 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n5jhc"] Apr 16 18:10:01.891591 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:01.891547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:01.891670 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:01.891647 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:10:02.800473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:02.800381 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="131e4e617bd94f2a39142be0546ba152c3b97e565157435176a33e144d5bde0b" exitCode=0 Apr 16 18:10:02.800849 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:02.800471 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"131e4e617bd94f2a39142be0546ba152c3b97e565157435176a33e144d5bde0b"} Apr 16 18:10:03.639638 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:03.639610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:10:03.639837 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:03.639609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:03.639837 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:03.639754 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:10:03.639837 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:03.639609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:10:03.639837 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:03.639804 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:10:03.640042 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:03.639862 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:10:05.640611 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:05.640582 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:05.641148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:05.640582 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:10:05.641148 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:05.640700 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:10:05.641148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:05.640586 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:10:05.641148 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:05.640797 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:10:05.641148 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:05.640885 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:10:07.637141 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.637115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:07.637598 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.637115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:10:07.637598 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:07.637217 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n5jhc" podUID="84581d08-c57a-48de-a2e5-e6f3f3c2e0b4" Apr 16 18:10:07.637598 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:07.637317 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:10:07.637598 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.637115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b" Apr 16 18:10:07.637598 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:07.637441 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksk5b" podUID="bbf3171b-ab57-4ca5-93df-e38037360c5b" Apr 16 18:10:07.882371 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.882342 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-228.ec2.internal" event="NodeReady" Apr 16 18:10:07.882551 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.882505 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:10:07.919636 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.919600 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"] Apr 16 18:10:07.928769 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.928733 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"] Apr 16 18:10:07.928947 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.928906 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:10:07.930992 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.930973 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:10:07.930992 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.930979 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:10:07.931161 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.931016 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tt7tt\"" Apr 16 18:10:07.931351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.931308 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:10:07.937482 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.937210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:10:07.937604 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.937534 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"] Apr 16 18:10:07.937604 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.937583 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"] Apr 16 18:10:07.937733 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.937722 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:10:07.939801 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.939730 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l42ss\"" Apr 16 18:10:07.939923 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.939824 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:10:07.939923 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.939872 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:10:07.945022 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.944998 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vj2"] Apr 16 18:10:07.950133 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.950111 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5hd6g"] Apr 16 18:10:07.950365 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.950275 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:07.952443 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.952423 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:10:07.952564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.952424 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:10:07.952782 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.952762 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\""
Apr 16 18:10:07.953030 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.953013 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:10:07.956122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.956105 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:07.956563 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.956329 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vj2"]
Apr 16 18:10:07.958354 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.958335 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:10:07.958490 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.958473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\""
Apr 16 18:10:07.959502 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.959483 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:10:07.971646 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:07.971590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5hd6g"]
Apr 16 18:10:08.042103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042183 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jqx\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042229 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.042323 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.042563 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5cbca814-7996-4973-bc09-b736c26d6348-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.042563 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.042385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143387 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143226 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143387 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jqx\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.143608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07250da-0b72-4bc8-9129-e53b19a95890-config-volume\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6sr\" (UniqueName: \"kubernetes.io/projected/991e5741-2829-429b-a2bd-759f5392a792-kube-api-access-bz6sr\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2b8\" (UniqueName: \"kubernetes.io/projected/d07250da-0b72-4bc8-9129-e53b19a95890-kube-api-access-qx2b8\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.143786 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.143808 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143848 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5cbca814-7996-4973-bc09-b736c26d6348-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.143884 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:08.643862569 +0000 UTC m=+33.568413218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:08.143932 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.143923 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:08.144367 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.143993 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:08.643974068 +0000 UTC m=+33.568524702 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:08.144367 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.143921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.144367 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.144237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07250da-0b72-4bc8-9129-e53b19a95890-tmp-dir\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.144367 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.144269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.144367 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.144275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.144594 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.144577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5cbca814-7996-4973-bc09-b736c26d6348-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.144738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.144719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.148230 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.148203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.148341 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.148214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.152730 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.152705 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.152821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.152744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jqx\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.244855 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.244855 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07250da-0b72-4bc8-9129-e53b19a95890-config-volume\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.244855 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6sr\" (UniqueName: \"kubernetes.io/projected/991e5741-2829-429b-a2bd-759f5392a792-kube-api-access-bz6sr\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.244855 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2b8\" (UniqueName: \"kubernetes.io/projected/d07250da-0b72-4bc8-9129-e53b19a95890-kube-api-access-qx2b8\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.244910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07250da-0b72-4bc8-9129-e53b19a95890-tmp-dir\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.244942 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.245026 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:08.745005842 +0000 UTC m=+33.669556488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.245032 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:08.245175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.245093 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:08.74507678 +0000 UTC m=+33.669627409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:08.245479 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.245240 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07250da-0b72-4bc8-9129-e53b19a95890-tmp-dir\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.245479 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.245426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07250da-0b72-4bc8-9129-e53b19a95890-config-volume\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.253844 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.253813 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2b8\" (UniqueName: \"kubernetes.io/projected/d07250da-0b72-4bc8-9129-e53b19a95890-kube-api-access-qx2b8\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.253988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.253970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6sr\" (UniqueName: \"kubernetes.io/projected/991e5741-2829-429b-a2bd-759f5392a792-kube-api-access-bz6sr\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.649008 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.648983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.649047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.649123 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.649142 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.649178 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.649226 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.64920932 +0000 UTC m=+34.573759954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:08.649380 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.649239 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.649233051 +0000 UTC m=+34.573783680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:08.749955 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.749914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:08.750172 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.750010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:08.750172 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.750070 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:08.750172 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.750126 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.750112777 +0000 UTC m=+34.674663411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:08.750172 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.750160 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:08.750408 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:08.750223 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.750209768 +0000 UTC m=+34.674760397 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:08.816435 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:08.816405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerStarted","Data":"55ff0a8834eb02460966709d7723c234ed569627c58a3f62472f2e200a073b17"}
Apr 16 18:10:09.254982 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.254948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:10:09.255174 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.255073 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:09.255174 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.255130 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.255114138 +0000 UTC m=+66.179664769 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:09.356237 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.356197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:10:09.356396 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.356382 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:09.356436 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.356403 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:09.356436 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.356412 2571 projected.go:194] Error preparing data for projected volume kube-api-access-kqh72 for pod openshift-network-diagnostics/network-check-target-n5jhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:09.356502 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.356461 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72 podName:84581d08-c57a-48de-a2e5-e6f3f3c2e0b4 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:41.356447891 +0000 UTC m=+66.280998520 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kqh72" (UniqueName: "kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72") pod "network-check-target-n5jhc" (UID: "84581d08-c57a-48de-a2e5-e6f3f3c2e0b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:09.636297 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.636255 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:10:09.636297 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.636280 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:10:09.636297 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.636299 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:10:09.639509 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639487 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:10:09.639509 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\""
Apr 16 18:10:09.639654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639597 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:10:09.639654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:10:09.639736 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639663 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:10:09.639769 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.639746 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bnhf5\""
Apr 16 18:10:09.659239 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.659221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.659308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.659365 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.659402 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.659414 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.659427 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.659409111 +0000 UTC m=+36.583959744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:09.659572 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.659447 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.659438397 +0000 UTC m=+36.583989038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:09.760186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.760155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:09.760348 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.760225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:09.760348 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.760303 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:09.760415 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.760362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.760347691 +0000 UTC m=+36.684898319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:09.760415 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.760312 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:09.760481 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:09.760435 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:11.760423568 +0000 UTC m=+36.684974197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:09.820883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.820853 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="55ff0a8834eb02460966709d7723c234ed569627c58a3f62472f2e200a073b17" exitCode=0
Apr 16 18:10:09.821009 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:09.820911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"55ff0a8834eb02460966709d7723c234ed569627c58a3f62472f2e200a073b17"}
Apr 16 18:10:10.826941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:10.826908 2571 generic.go:358] "Generic (PLEG): container finished" podID="df565fbf-1e31-4d50-9c3f-fbc370ba976a" containerID="237d056234c015f013f710986257135e9548528f777020a7da978c247f7f9075" exitCode=0
Apr 16 18:10:10.827475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:10.826985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerDied","Data":"237d056234c015f013f710986257135e9548528f777020a7da978c247f7f9075"}
Apr 16 18:10:11.678026 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.677927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:11.678026 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.678018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:11.678251 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.678101 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:11.678251 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.678123 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:11.678251 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.678154 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:11.678251 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.678190 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.678171662 +0000 UTC m=+40.602722291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:11.678251 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.678209 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.678201112 +0000 UTC m=+40.602751741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:11.779086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.779051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:11.779262 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.779126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:11.779262 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.779194 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:11.779262 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.779227 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:11.779262 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.779253 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.779238733 +0000 UTC m=+40.703789362 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:11.779409 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:11.779266 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.779260125 +0000 UTC m=+40.703810754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:11.831832 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.831801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-54j8c" event={"ID":"df565fbf-1e31-4d50-9c3f-fbc370ba976a","Type":"ContainerStarted","Data":"e2af420717bc280a68da9d97c195b6a5c2980599f9af4047cad548d4cbca5138"}
Apr 16 18:10:11.859832 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:11.859759 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-54j8c" podStartSLOduration=6.599727978 podStartE2EDuration="36.859744963s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:09:38.269767807 +0000 UTC m=+3.194318438" lastFinishedPulling="2026-04-16 18:10:08.529784795 +0000 UTC m=+33.454335423" observedRunningTime="2026-04-16 18:10:11.858561063 +0000 UTC m=+36.783111713" watchObservedRunningTime="2026-04-16 18:10:11.859744963 +0000 UTC m=+36.784295592"
Apr 16 18:10:12.988621 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:12.988583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:10:12.991167 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:12.991141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bbf3171b-ab57-4ca5-93df-e38037360c5b-original-pull-secret\") pod \"global-pull-secret-syncer-ksk5b\" (UID: \"bbf3171b-ab57-4ca5-93df-e38037360c5b\") " pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:10:13.246244 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:13.246154 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksk5b"
Apr 16 18:10:13.383440 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:13.383273 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ksk5b"]
Apr 16 18:10:13.386882 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:10:13.386844 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf3171b_ab57_4ca5_93df_e38037360c5b.slice/crio-50318d7c9299e8c266e3b24b265115fe7a5a9c58d5f3681c3817ea2644289a31 WatchSource:0}: Error finding container 50318d7c9299e8c266e3b24b265115fe7a5a9c58d5f3681c3817ea2644289a31: Status 404 returned error can't find the container with id 50318d7c9299e8c266e3b24b265115fe7a5a9c58d5f3681c3817ea2644289a31
Apr 16 18:10:13.837126 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:13.837090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ksk5b" event={"ID":"bbf3171b-ab57-4ca5-93df-e38037360c5b","Type":"ContainerStarted","Data":"50318d7c9299e8c266e3b24b265115fe7a5a9c58d5f3681c3817ea2644289a31"}
Apr 16 18:10:15.710429 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:15.710387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:15.710479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.710530 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.710555 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.710625 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:23.71060771 +0000 UTC m=+48.635158371 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.710650 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:15.710909 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.710731 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:23.710714228 +0000 UTC m=+48.635264857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:15.810894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:15.810852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:15.811080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:15.810958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:15.811080 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.810999 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:15.811080 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.811067 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:23.811053228 +0000 UTC m=+48.735603861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:15.811239 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.811117 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:15.811239 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:15.811184 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:23.811166313 +0000 UTC m=+48.735717049 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:17.846579 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:17.846541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ksk5b" event={"ID":"bbf3171b-ab57-4ca5-93df-e38037360c5b","Type":"ContainerStarted","Data":"598b5f858f25e5f13afd4d7f53d42434c9cc30790a394c6b0aaf64ad79667f9c"}
Apr 16 18:10:17.866214 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:17.866153 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ksk5b" podStartSLOduration=32.995986836 podStartE2EDuration="36.866134559s" podCreationTimestamp="2026-04-16 18:09:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:13.388807555 +0000 UTC m=+38.313358199" lastFinishedPulling="2026-04-16 18:10:17.258955281 +0000 UTC m=+42.183505922" observedRunningTime="2026-04-16 18:10:17.865406985 +0000 UTC m=+42.789957662" watchObservedRunningTime="2026-04-16 18:10:17.866134559 +0000 UTC m=+42.790685209"
Apr 16 18:10:23.773133 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:23.773086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:23.773174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.773233 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.773268 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.773279 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.773302 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:39.77328623 +0000 UTC m=+64.697836858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:23.773505 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.773320 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:10:39.773308529 +0000 UTC m=+64.697859159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:23.873920 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:23.873879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:23.874106 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:23.873966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:23.874106 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.874022 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:23.874106 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.874084 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:39.874070373 +0000 UTC m=+64.798621001 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:23.874106 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.874093 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:23.874257 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:23.874137 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:39.874123432 +0000 UTC m=+64.798674061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:33.813348 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:33.813318 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mj9tk"
Apr 16 18:10:39.791422 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:39.791383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:39.791449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.791555 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.791540 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.791584 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.791620 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.791607036 +0000 UTC m=+96.716157665 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found
Apr 16 18:10:39.791830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.791634 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.791627239 +0000 UTC m=+96.716177868 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found
Apr 16 18:10:39.892190 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:39.892165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2"
Apr 16 18:10:39.892353 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:39.892225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g"
Apr 16 18:10:39.892353 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.892313 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:10:39.892353 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.892314 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:10:39.892452 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.892359 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.8923464 +0000 UTC m=+96.816897029 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found
Apr 16 18:10:39.892452 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:39.892370 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:11.892365131 +0000 UTC m=+96.816915760 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found
Apr 16 18:10:41.303213 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.303179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm"
Apr 16 18:10:41.305601 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.305583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:10:41.313811 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:41.313790 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:10:41.313867 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:10:41.313857 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:11:45.313841032 +0000 UTC m=+130.238391660 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : secret "metrics-daemon-secret" not found
Apr 16 18:10:41.404255 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.404220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:10:41.407013 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.406997 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:10:41.417121 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.417102 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:10:41.428972 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.428946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqh72\" (UniqueName: \"kubernetes.io/projected/84581d08-c57a-48de-a2e5-e6f3f3c2e0b4-kube-api-access-kqh72\") pod \"network-check-target-n5jhc\" (UID: \"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4\") " pod="openshift-network-diagnostics/network-check-target-n5jhc"
Apr 16 18:10:41.454563 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.454541 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bnhf5\""
Apr 16 18:10:41.462545 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.462528 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:41.594754 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.594718 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n5jhc"] Apr 16 18:10:41.598241 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:10:41.598206 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84581d08_c57a_48de_a2e5_e6f3f3c2e0b4.slice/crio-958a40b549cca3b012132d0a21fba7544fdeb77a015209e3e96018271f0c6478 WatchSource:0}: Error finding container 958a40b549cca3b012132d0a21fba7544fdeb77a015209e3e96018271f0c6478: Status 404 returned error can't find the container with id 958a40b549cca3b012132d0a21fba7544fdeb77a015209e3e96018271f0c6478 Apr 16 18:10:41.895146 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:41.895059 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n5jhc" event={"ID":"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4","Type":"ContainerStarted","Data":"958a40b549cca3b012132d0a21fba7544fdeb77a015209e3e96018271f0c6478"} Apr 16 18:10:44.902906 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:44.902873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n5jhc" event={"ID":"84581d08-c57a-48de-a2e5-e6f3f3c2e0b4","Type":"ContainerStarted","Data":"3d4553ffa928c0ffecc4efb91f5313fe7bd8489d7385dd0694bcdfa2ab83c6b7"} Apr 16 18:10:44.903295 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:44.903008 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:10:44.919049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:10:44.919012 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n5jhc" podStartSLOduration=67.29406506 podStartE2EDuration="1m9.91900055s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:10:41.599959697 +0000 UTC m=+66.524510325" lastFinishedPulling="2026-04-16 18:10:44.224895183 +0000 UTC m=+69.149445815" observedRunningTime="2026-04-16 18:10:44.9186746 +0000 UTC m=+69.843225264" watchObservedRunningTime="2026-04-16 18:10:44.91900055 +0000 UTC m=+69.843551201" Apr 16 18:11:11.825509 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:11.825471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:11.825533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.825633 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" 
not found Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.825650 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.825669 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69f768f794-7jzj8: secret "image-registry-tls" not found Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.825716 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert podName:5cbca814-7996-4973-bc09-b736c26d6348 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.825699365 +0000 UTC m=+160.750250006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-wnnkf" (UID: "5cbca814-7996-4973-bc09-b736c26d6348") : secret "networking-console-plugin-cert" not found Apr 16 18:11:11.825980 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.825733 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls podName:08bf3f64-b45f-4345-93cd-34468773149c nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.825725924 +0000 UTC m=+160.750276553 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls") pod "image-registry-69f768f794-7jzj8" (UID: "08bf3f64-b45f-4345-93cd-34468773149c") : secret "image-registry-tls" not found Apr 16 18:11:11.926136 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:11.926093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g" Apr 16 18:11:11.926301 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:11.926185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2" Apr 16 18:11:11.926301 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.926235 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:11.926301 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.926295 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:11.926451 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.926299 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls podName:d07250da-0b72-4bc8-9129-e53b19a95890 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.92628625 +0000 UTC m=+160.850836879 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls") pod "dns-default-5hd6g" (UID: "d07250da-0b72-4bc8-9129-e53b19a95890") : secret "dns-default-metrics-tls" not found Apr 16 18:11:11.926451 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:11.926353 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert podName:991e5741-2829-429b-a2bd-759f5392a792 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.926338383 +0000 UTC m=+160.850889012 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert") pod "ingress-canary-n9vj2" (UID: "991e5741-2829-429b-a2bd-759f5392a792") : secret "canary-serving-cert" not found Apr 16 18:11:15.907892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:15.907855 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n5jhc" Apr 16 18:11:45.358016 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:45.357976 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:11:45.358421 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:45.358129 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:11:45.358421 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:45.358191 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs podName:9d27531f-08c4-4c67-974c-31cacc77b8be nodeName:}" failed. No retries permitted until 2026-04-16 18:13:47.358175144 +0000 UTC m=+252.282725772 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs") pod "network-metrics-daemon-lnrzm" (UID: "9d27531f-08c4-4c67-974c-31cacc77b8be") : secret "metrics-daemon-secret" not found Apr 16 18:11:55.771611 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.771558 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr"] Apr 16 18:11:55.776958 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.776928 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7"] Apr 16 18:11:55.777097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.777079 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.779266 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.779238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.779378 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.779267 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.779431 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.779389 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:11:55.779699 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.779662 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj"] Apr 16 18:11:55.779787 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.779773 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.780428 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.780402 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6hj2n\"" Apr 16 18:11:55.780428 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.780420 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:11:55.782279 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.782261 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-98zlj"] Apr 16 18:11:55.782401 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.782386 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" Apr 16 18:11:55.784383 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784360 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:11:55.784484 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784468 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.784560 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8v65v\"" Apr 16 18:11:55.784664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784565 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.784664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784649 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9qfgv\"" Apr 16 18:11:55.784800 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784721 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.784878 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.784864 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.785019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.785005 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.786944 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.786924 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.787049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.786945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.787049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.786924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:11:55.787049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.787012 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-d42tl\"" Apr 16 18:11:55.787182 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.787047 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:11:55.790775 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.790758 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj"] Apr 16 18:11:55.791959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.791940 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7"] Apr 16 18:11:55.793075 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.793046 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:11:55.801640 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.801617 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr"] Apr 16 18:11:55.808334 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.808311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-98zlj"] Apr 16 18:11:55.836592 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.836795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.836795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-serving-cert\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.836795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkf2\" (UniqueName: \"kubernetes.io/projected/48231118-0790-422a-b4db-213ba79fda5b-kube-api-access-2qkf2\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.836795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-snapshots\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6nr\" (UniqueName: \"kubernetes.io/projected/ebd535ee-8abb-4929-ad09-ec940628fe6a-kube-api-access-6v6nr\") pod \"volume-data-source-validator-7d955d5dd4-wdmgj\" (UID: \"ebd535ee-8abb-4929-ad09-ec940628fe6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/48231118-0790-422a-b4db-213ba79fda5b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-tmp\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836958 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwlnx\" (UniqueName: 
\"kubernetes.io/projected/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-kube-api-access-bwlnx\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.837019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.836994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2fpz\" (UniqueName: \"kubernetes.io/projected/9076fc18-c5fa-42c2-8798-729417891cb0-kube-api-access-w2fpz\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.881135 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.881102 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm"] Apr 16 18:11:55.884102 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.884086 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:55.886075 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.886051 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:11:55.886075 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.886064 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:11:55.886281 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.886265 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.886667 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.886649 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-lgw5t\"" Apr 16 18:11:55.886667 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.886658 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.898428 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.898403 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm"] Apr 16 18:11:55.938325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp9h\" (UniqueName: \"kubernetes.io/projected/7cd24549-bac0-49c2-ab16-4e779bd2e01e-kube-api-access-7zp9h\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:55.938325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-trusted-ca-bundle\") pod 
\"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.938550 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.938550 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-serving-cert\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.938550 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:55.938442 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:55.938550 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:55.938503 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.438489644 +0000 UTC m=+141.363040273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:55.938550 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkf2\" (UniqueName: \"kubernetes.io/projected/48231118-0790-422a-b4db-213ba79fda5b-kube-api-access-2qkf2\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938607 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd24549-bac0-49c2-ab16-4e779bd2e01e-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-snapshots\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938669 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6v6nr\" (UniqueName: \"kubernetes.io/projected/ebd535ee-8abb-4929-ad09-ec940628fe6a-kube-api-access-6v6nr\") pod \"volume-data-source-validator-7d955d5dd4-wdmgj\" (UID: \"ebd535ee-8abb-4929-ad09-ec940628fe6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/48231118-0790-422a-b4db-213ba79fda5b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.938859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd24549-bac0-49c2-ab16-4e779bd2e01e-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:55.938889 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.938933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-tmp\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:55.938964 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls podName:9076fc18-c5fa-42c2-8798-729417891cb0 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.438943634 +0000 UTC m=+141.363494268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls") pod "cluster-samples-operator-667775844f-hrxm7" (UID: "9076fc18-c5fa-42c2-8798-729417891cb0") : secret "samples-operator-tls" not found Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwlnx\" (UniqueName: \"kubernetes.io/projected/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-kube-api-access-bwlnx\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2fpz\" (UniqueName: \"kubernetes.io/projected/9076fc18-c5fa-42c2-8798-729417891cb0-kube-api-access-w2fpz\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.939424 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-tmp\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939424 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-snapshots\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939488 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939437 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.939542 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/48231118-0790-422a-b4db-213ba79fda5b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.939594 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.939580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.940841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.940821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-serving-cert\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.948121 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.948095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkf2\" (UniqueName: \"kubernetes.io/projected/48231118-0790-422a-b4db-213ba79fda5b-kube-api-access-2qkf2\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:55.948302 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.948212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6nr\" (UniqueName: \"kubernetes.io/projected/ebd535ee-8abb-4929-ad09-ec940628fe6a-kube-api-access-6v6nr\") pod \"volume-data-source-validator-7d955d5dd4-wdmgj\" (UID: \"ebd535ee-8abb-4929-ad09-ec940628fe6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" Apr 16 18:11:55.948780 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.948760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwlnx\" (UniqueName: \"kubernetes.io/projected/60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3-kube-api-access-bwlnx\") pod \"insights-operator-5785d4fcdd-98zlj\" (UID: \"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3\") " pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:55.949320 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.949297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2fpz\" (UniqueName: \"kubernetes.io/projected/9076fc18-c5fa-42c2-8798-729417891cb0-kube-api-access-w2fpz\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:55.984564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.984519 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-8459479994-94kcv"] Apr 16 18:11:55.989678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.989660 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:55.994645 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.994621 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:11:55.994858 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.994712 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 18:11:55.994858 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.994772 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-f79vk\"" Apr 16 18:11:55.995051 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.995033 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 18:11:55.995107 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.995021 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:11:55.995107 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.995045 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 18:11:55.995364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:55.995350 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 18:11:56.003384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.003359 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8459479994-94kcv"] Apr 16 18:11:56.039802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.039735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd24549-bac0-49c2-ab16-4e779bd2e01e-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.039802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.039778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-default-certificate\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.039802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.039800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.040097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.039931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6t4\" (UniqueName: \"kubernetes.io/projected/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-kube-api-access-6d6t4\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " 
pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.040097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.039957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp9h\" (UniqueName: \"kubernetes.io/projected/7cd24549-bac0-49c2-ab16-4e779bd2e01e-kube-api-access-7zp9h\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.040097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.040004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.040097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.040034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd24549-bac0-49c2-ab16-4e779bd2e01e-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.040097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.040075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-stats-auth\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.040375 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.040353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd24549-bac0-49c2-ab16-4e779bd2e01e-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.042163 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.042146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd24549-bac0-49c2-ab16-4e779bd2e01e-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.049212 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.049191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp9h\" (UniqueName: \"kubernetes.io/projected/7cd24549-bac0-49c2-ab16-4e779bd2e01e-kube-api-access-7zp9h\") pod \"kube-storage-version-migrator-operator-756bb7d76f-xwqsm\" (UID: \"7cd24549-bac0-49c2-ab16-4e779bd2e01e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.103428 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.103397 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" Apr 16 18:11:56.108193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.108165 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" Apr 16 18:11:56.141712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.141415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-default-certificate\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.141712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.141474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.141712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.141537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6t4\" (UniqueName: \"kubernetes.io/projected/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-kube-api-access-6d6t4\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.141712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.141618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.141712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.141713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-stats-auth\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.142109 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.141849 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:56.142109 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.141870 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.641847768 +0000 UTC m=+141.566398420 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:56.142109 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.141919 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:56.641902721 +0000 UTC m=+141.566453355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : secret "router-metrics-certs-default" not found Apr 16 18:11:56.144191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.144163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-default-certificate\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.144446 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.144425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-stats-auth\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.161415 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.161388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6t4\" (UniqueName: \"kubernetes.io/projected/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-kube-api-access-6d6t4\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.194076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.194042 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" Apr 16 18:11:56.230892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.230867 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-98zlj"] Apr 16 18:11:56.233227 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:11:56.233190 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c658cc_6c75_4ae1_a6a8_d29a55bfd5a3.slice/crio-940195c20cd1b9048af2a2b00980087d93b9f3dd7452fa56c21e90e6a5308306 WatchSource:0}: Error finding container 940195c20cd1b9048af2a2b00980087d93b9f3dd7452fa56c21e90e6a5308306: Status 404 returned error can't find the container with id 940195c20cd1b9048af2a2b00980087d93b9f3dd7452fa56c21e90e6a5308306 Apr 16 18:11:56.251610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.251589 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj"] Apr 16 18:11:56.254345 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:11:56.254305 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebd535ee_8abb_4929_ad09_ec940628fe6a.slice/crio-288140fbb57e95b4e7fea20c107b60e0642dad2415fa3625f1e86b6a01655dce WatchSource:0}: Error finding container 288140fbb57e95b4e7fea20c107b60e0642dad2415fa3625f1e86b6a01655dce: Status 404 returned error can't find the container with id 288140fbb57e95b4e7fea20c107b60e0642dad2415fa3625f1e86b6a01655dce Apr 16 18:11:56.319043 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.318959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm"] Apr 16 18:11:56.322811 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:11:56.322785 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cd24549_bac0_49c2_ab16_4e779bd2e01e.slice/crio-a0855460b6d67ebad93b3084fc757d0e43d3055b62fc1ed0070d8f8f5183d278 WatchSource:0}: Error finding container a0855460b6d67ebad93b3084fc757d0e43d3055b62fc1ed0070d8f8f5183d278: Status 404 returned error can't find the container with id a0855460b6d67ebad93b3084fc757d0e43d3055b62fc1ed0070d8f8f5183d278 Apr 16 18:11:56.443888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.443858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:56.444075 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.443921 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:56.444075 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.444006 2571 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:56.444075 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.444018 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:11:56.444075 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.444068 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls podName:9076fc18-c5fa-42c2-8798-729417891cb0 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.444053751 +0000 UTC m=+142.368604381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls") pod "cluster-samples-operator-667775844f-hrxm7" (UID: "9076fc18-c5fa-42c2-8798-729417891cb0") : secret "samples-operator-tls" not found Apr 16 18:11:56.444075 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.444080 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.444074657 +0000 UTC m=+142.368625286 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:56.646350 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.646268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.646496 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:56.646353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:56.646496 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.646430 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.64641291 +0000 UTC m=+142.570963539 (durationBeforeRetry 1s). 
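
Two things in the entries above are worth decoding before the pattern repeats. First, the W-level warnings from manager.go:1169 ("Failed to process watch event ... Status 404") are, as far as these logs show, cAdvisor racing container creation: the crio-<id> cgroup directory appears before the runtime can describe that container id, so the lookup 404s; each such container shows up moments later in a "SyncLoop (PLEG)" ContainerStarted event, which suggests the warnings are start-up noise rather than failures. Second, every kubenswrapper message carries a klog header: a severity letter (I/W/E), MMDD, wall-clock time to the microsecond, the logging thread id, and the source file:line. A minimal Go sketch of a parser for that header (the regexp and the names klogHeader/line are mine, tailored only to the shapes in this log):

    package main

    import (
        "fmt"
        "regexp"
    )

    // klogHeader matches e.g. `E0416 18:11:56.444006 2571 secret.go:189]`:
    // severity, MMDD, wall-clock time, thread id, and file:line of the caller.
    var klogHeader = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+) ([\w./-]+:\d+)\]`)

    func main() {
        line := `E0416 18:11:56.444006 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls`
        if m := klogHeader.FindStringSubmatch(line); m != nil {
            fmt.Printf("severity=%s date=%s time=%s tid=%s source=%s\n", m[1], m[2], m[3], m[4], m[5])
        }
    }
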
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:56.646569 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.646436 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:56.646569 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:56.646555 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.646537806 +0000 UTC m=+142.571088435 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : secret "router-metrics-certs-default" not found Apr 16 18:11:57.037778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.037727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" event={"ID":"7cd24549-bac0-49c2-ab16-4e779bd2e01e","Type":"ContainerStarted","Data":"a0855460b6d67ebad93b3084fc757d0e43d3055b62fc1ed0070d8f8f5183d278"} Apr 16 18:11:57.039466 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.039436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" event={"ID":"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3","Type":"ContainerStarted","Data":"940195c20cd1b9048af2a2b00980087d93b9f3dd7452fa56c21e90e6a5308306"} Apr 16 18:11:57.040922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.040883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" event={"ID":"ebd535ee-8abb-4929-ad09-ec940628fe6a","Type":"ContainerStarted","Data":"288140fbb57e95b4e7fea20c107b60e0642dad2415fa3625f1e86b6a01655dce"} Apr 16 18:11:57.454337 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.454289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:57.454566 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.454541 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:11:57.454708 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.454616 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls podName:9076fc18-c5fa-42c2-8798-729417891cb0 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:59.454596473 +0000 UTC m=+144.379147106 (durationBeforeRetry 2s). 
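
The router-default failure above is a different shape from the missing secrets: the service-ca-bundle configmap exists, but its service-ca.crt key has not been populated yet ("configmap references non-existent config key"). In OpenShift that key is injected by the service-ca operator into configmaps annotated service.beta.openshift.io/inject-cabundle: "true", so the mount succeeds with no intervention once injection lands, as it does at 18:12:11 below. A client-go sketch of the same readiness check from the cluster side, assuming a kubeconfig in the default location; the namespace and configmap name come from the log, while the poll interval and timeout are arbitrary:

    package main

    import (
        "context"
        "fmt"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // Poll until the service-ca operator has injected the CA bundle key;
        // the kubelet achieves the same effect implicitly by retrying the mount.
        err = wait.PollUntilContextTimeout(context.Background(), 2*time.Second, 2*time.Minute, true,
            func(ctx context.Context) (bool, error) {
                cm, err := client.CoreV1().ConfigMaps("openshift-ingress").Get(ctx, "service-ca-bundle", metav1.GetOptions{})
                if err != nil {
                    return false, nil // not there yet; keep polling
                }
                _, ok := cm.Data["service-ca.crt"]
                return ok, nil
            })
        fmt.Println("service-ca.crt injected:", err == nil)
    }
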
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls") pod "cluster-samples-operator-667775844f-hrxm7" (UID: "9076fc18-c5fa-42c2-8798-729417891cb0") : secret "samples-operator-tls" not found Apr 16 18:11:57.454708 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.454611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:57.454849 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.454768 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:57.454849 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.454824 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:11:59.454807325 +0000 UTC m=+144.379357960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:57.657005 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.656971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:57.657183 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.657164 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:59.657131048 +0000 UTC m=+144.581681699 (durationBeforeRetry 2s). 
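
Note the retry cadence: the same two secret mounts failed at 18:11:56 with durationBeforeRetry 1s, and fail again here at 18:11:57 with 2s; below they progress to 4s, 8s, and finally 16s. That is plain exponential backoff with factor 2. A sketch of the schedule with an illustrative cap (nextDelay and the constants are mine; the kubelet's own live in its operation executor and are not visible in the log):

    package main

    import (
        "fmt"
        "time"
    )

    // nextDelay doubles the retry delay up to maxDelay, reproducing the
    // 1s -> 2s -> 4s -> 8s -> 16s progression visible in this log.
    func nextDelay(current, maxDelay time.Duration) time.Duration {
        if current == 0 {
            return time.Second
        }
        if next := 2 * current; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 5; i++ {
            d = nextDelay(d, 2*time.Minute)
            fmt.Println(d) // 1s 2s 4s 8s 16s
        }
    }
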
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:57.657262 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:57.657246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:57.657404 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.657383 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:57.657458 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:57.657446 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:11:59.657429923 +0000 UTC m=+144.581980561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : secret "router-metrics-certs-default" not found Apr 16 18:11:58.044256 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:58.044224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" event={"ID":"ebd535ee-8abb-4929-ad09-ec940628fe6a","Type":"ContainerStarted","Data":"6778ba819d869b3648ad236b89401bf576cecbda3a73393c1b55876c6f8ba265"} Apr 16 18:11:58.063493 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:58.063444 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wdmgj" podStartSLOduration=1.740447343 podStartE2EDuration="3.063429211s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.256270984 +0000 UTC m=+141.180821616" lastFinishedPulling="2026-04-16 18:11:57.579252842 +0000 UTC m=+142.503803484" observedRunningTime="2026-04-16 18:11:58.062934553 +0000 UTC m=+142.987485204" watchObservedRunningTime="2026-04-16 18:11:58.063429211 +0000 UTC m=+142.987979861" Apr 16 18:11:59.047655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.047615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" event={"ID":"7cd24549-bac0-49c2-ab16-4e779bd2e01e","Type":"ContainerStarted","Data":"22975511551a798bcb74ce1c7a3270903051e604c4528d0a0b38227c1926cdd7"} Apr 16 18:11:59.049078 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.049046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" event={"ID":"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3","Type":"ContainerStarted","Data":"c8323de528ee08d5dd405b2e928d1741d17cb4e68c2da6fa1322e86a20e463e7"} Apr 16 18:11:59.072989 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.072929 2571 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" podStartSLOduration=1.757720513 podStartE2EDuration="4.072914829s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.324583692 +0000 UTC m=+141.249134321" lastFinishedPulling="2026-04-16 18:11:58.639778 +0000 UTC m=+143.564328637" observedRunningTime="2026-04-16 18:11:59.072900988 +0000 UTC m=+143.997451638" watchObservedRunningTime="2026-04-16 18:11:59.072914829 +0000 UTC m=+143.997465483" Apr 16 18:11:59.116608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.116561 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" podStartSLOduration=1.7146571069999998 podStartE2EDuration="4.11654443s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.235571514 +0000 UTC m=+141.160122150" lastFinishedPulling="2026-04-16 18:11:58.637458831 +0000 UTC m=+143.562009473" observedRunningTime="2026-04-16 18:11:59.115167726 +0000 UTC m=+144.039718379" watchObservedRunningTime="2026-04-16 18:11:59.11654443 +0000 UTC m=+144.041095112" Apr 16 18:11:59.474573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.474527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:11:59.474790 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.474595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:11:59.474790 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.474682 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:59.474790 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.474772 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:12:03.474754546 +0000 UTC m=+148.399305179 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:59.474968 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.474705 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:11:59.474968 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.474842 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls podName:9076fc18-c5fa-42c2-8798-729417891cb0 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:03.474827142 +0000 UTC m=+148.399377787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls") pod "cluster-samples-operator-667775844f-hrxm7" (UID: "9076fc18-c5fa-42c2-8798-729417891cb0") : secret "samples-operator-tls" not found Apr 16 18:11:59.675855 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.675816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:59.676051 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.675930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:11:59.676051 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.675959 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:11:59.676051 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.676025 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:03.676004969 +0000 UTC m=+148.600555597 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : secret "router-metrics-certs-default" not found Apr 16 18:11:59.676269 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:11:59.676088 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:03.676074672 +0000 UTC m=+148.600625320 (durationBeforeRetry 4s). 
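
Every deadline in these entries pairs a wall-clock time with an "m=+<seconds>" suffix: Go's monotonic clock reading, counted from process start. Subtracting one from the other recovers when the kubelet process started, roughly 18:09:35.075 here, which agrees with the unit starting at the top of this journal. A sketch using one (timestamp, offset) pair taken from the entries above:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // From: "No retries permitted until 2026-04-16 18:11:57.444053751
        // +0000 UTC m=+142.368604381". The m= value is seconds of process uptime.
        wall, _ := time.Parse(time.RFC3339Nano, "2026-04-16T18:11:57.444053751Z")
        offset := 142.368604381

        start := wall.Add(-time.Duration(offset * float64(time.Second)))
        fmt.Println("kubelet process started at ~", start.UTC()) // ~18:09:35.075
    }
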
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : configmap references non-existent config key: service-ca.crt Apr 16 18:11:59.950127 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.950085 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8"] Apr 16 18:11:59.953380 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.953329 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" Apr 16 18:11:59.955888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.955867 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:59.956651 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.956633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gjdpr\"" Apr 16 18:11:59.956793 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.956636 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:11:59.962327 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:11:59.962302 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8"] Apr 16 18:12:00.079934 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:00.079897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wp26\" (UniqueName: \"kubernetes.io/projected/6f673f44-4374-41cb-8649-2d63280b1dcb-kube-api-access-5wp26\") pod \"migrator-64d4d94569-lpnp8\" (UID: \"6f673f44-4374-41cb-8649-2d63280b1dcb\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" Apr 16 18:12:00.181293 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:00.181257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wp26\" (UniqueName: \"kubernetes.io/projected/6f673f44-4374-41cb-8649-2d63280b1dcb-kube-api-access-5wp26\") pod \"migrator-64d4d94569-lpnp8\" (UID: \"6f673f44-4374-41cb-8649-2d63280b1dcb\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" Apr 16 18:12:00.191369 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:00.191341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wp26\" (UniqueName: \"kubernetes.io/projected/6f673f44-4374-41cb-8649-2d63280b1dcb-kube-api-access-5wp26\") pod \"migrator-64d4d94569-lpnp8\" (UID: \"6f673f44-4374-41cb-8649-2d63280b1dcb\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" Apr 16 18:12:00.263568 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:00.263487 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" Apr 16 18:12:00.380752 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:00.380717 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8"] Apr 16 18:12:00.384255 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:00.384229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f673f44_4374_41cb_8649_2d63280b1dcb.slice/crio-5ed2aa1d1116f7c386fb5c96296c5d054fc809c5ef8b146cd85ce8eecf566306 WatchSource:0}: Error finding container 5ed2aa1d1116f7c386fb5c96296c5d054fc809c5ef8b146cd85ce8eecf566306: Status 404 returned error can't find the container with id 5ed2aa1d1116f7c386fb5c96296c5d054fc809c5ef8b146cd85ce8eecf566306 Apr 16 18:12:01.053775 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:01.053741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" event={"ID":"6f673f44-4374-41cb-8649-2d63280b1dcb","Type":"ContainerStarted","Data":"5ed2aa1d1116f7c386fb5c96296c5d054fc809c5ef8b146cd85ce8eecf566306"} Apr 16 18:12:02.057738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:02.057704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" event={"ID":"6f673f44-4374-41cb-8649-2d63280b1dcb","Type":"ContainerStarted","Data":"e6de7f0893a89cee4f83892fb74eac53586b2f12c2eedce906a926a849a1cd9d"} Apr 16 18:12:02.058115 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:02.057745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" event={"ID":"6f673f44-4374-41cb-8649-2d63280b1dcb","Type":"ContainerStarted","Data":"e6be6bf99cfd0b97335e96ae66e52f758ef187560af4079a4dcad8f0f6758ab2"} Apr 16 18:12:02.075856 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:02.075804 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-lpnp8" podStartSLOduration=2.127824234 podStartE2EDuration="3.075789764s" podCreationTimestamp="2026-04-16 18:11:59 +0000 UTC" firstStartedPulling="2026-04-16 18:12:00.386281463 +0000 UTC m=+145.310832092" lastFinishedPulling="2026-04-16 18:12:01.33424698 +0000 UTC m=+146.258797622" observedRunningTime="2026-04-16 18:12:02.074592892 +0000 UTC m=+146.999143542" watchObservedRunningTime="2026-04-16 18:12:02.075789764 +0000 UTC m=+147.000340414" Apr 16 18:12:02.662597 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:02.662568 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7ldnx_382c7696-64ec-4dbb-9432-e6ac1f3479d8/dns-node-resolver/0.log" Apr 16 18:12:03.510967 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:03.510922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:12:03.511464 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:03.511003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:12:03.511464 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.511129 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:12:03.511464 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.511149 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:12:03.511464 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.511217 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls podName:9076fc18-c5fa-42c2-8798-729417891cb0 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:11.511200574 +0000 UTC m=+156.435751203 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls") pod "cluster-samples-operator-667775844f-hrxm7" (UID: "9076fc18-c5fa-42c2-8798-729417891cb0") : secret "samples-operator-tls" not found Apr 16 18:12:03.511464 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.511232 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:12:11.51122588 +0000 UTC m=+156.435776508 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:12:03.662607 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:03.662578 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4crql_bb99ad62-9922-4bfa-94da-001321cb977d/node-ca/0.log" Apr 16 18:12:03.713165 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:03.713125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:03.713344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:03.713241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:03.713344 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.713270 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:12:03.713344 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.713329 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:11.713311783 +0000 UTC m=+156.637862411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : secret "router-metrics-certs-default" not found Apr 16 18:12:03.713467 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:03.713400 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle podName:1c2effa7-2ad2-44ba-aad9-85b432e50f7e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:11.713383832 +0000 UTC m=+156.637934466 (durationBeforeRetry 8s). 
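
The nestedpendingoperations lines encode a second mechanism besides backoff: the kubelet tracks pending operations per volume, allowing at most one attempt in flight, and after a failure it stamps the earliest instant a new attempt may begin ("No retries permitted until ..."). A minimal sketch of such a gate, with illustrative names and types rather than the kubelet's own:

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    // opGate permits one attempt per key and, after a failure, blocks new
    // attempts until a recorded deadline ("No retries permitted until ...").
    type opGate struct {
        mu    sync.Mutex
        until map[string]time.Time
    }

    func newOpGate() *opGate { return &opGate{until: map[string]time.Time{}} }

    // tryStart reports whether an attempt for key is allowed at time now.
    func (g *opGate) tryStart(key string, now time.Time) bool {
        g.mu.Lock()
        defer g.mu.Unlock()
        return !now.Before(g.until[key])
    }

    // fail records a failed attempt and the backoff before the next one.
    func (g *opGate) fail(key string, now time.Time, backoff time.Duration) {
        g.mu.Lock()
        defer g.mu.Unlock()
        g.until[key] = now.Add(backoff)
    }

    func main() {
        g := newOpGate()
        key := "kubernetes.io/secret/.../cluster-monitoring-operator-tls" // elided
        now := time.Now()
        g.fail(key, now, 8*time.Second)
        fmt.Println(g.tryStart(key, now.Add(2*time.Second)))  // false: inside backoff
        fmt.Println(g.tryStart(key, now.Add(10*time.Second))) // true: deadline passed
    }
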
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle") pod "router-default-8459479994-94kcv" (UID: "1c2effa7-2ad2-44ba-aad9-85b432e50f7e") : configmap references non-existent config key: service-ca.crt Apr 16 18:12:05.063724 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:05.063680 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-lpnp8_6f673f44-4374-41cb-8649-2d63280b1dcb/migrator/0.log" Apr 16 18:12:05.263722 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:05.263675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-lpnp8_6f673f44-4374-41cb-8649-2d63280b1dcb/graceful-termination/0.log" Apr 16 18:12:05.464163 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:05.464133 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-xwqsm_7cd24549-bac0-49c2-ab16-4e779bd2e01e/kube-storage-version-migrator-operator/0.log" Apr 16 18:12:10.942701 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:10.942641 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" podUID="08bf3f64-b45f-4345-93cd-34468773149c" Apr 16 18:12:10.950814 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:10.950769 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" podUID="5cbca814-7996-4973-bc09-b736c26d6348" Apr 16 18:12:10.960949 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:10.960917 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-n9vj2" podUID="991e5741-2829-429b-a2bd-759f5392a792" Apr 16 18:12:10.981201 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:10.981163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5hd6g" podUID="d07250da-0b72-4bc8-9129-e53b19a95890" Apr 16 18:12:11.078110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.078078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:11.078265 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.078078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:11.078265 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.078079 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vj2" Apr 16 18:12:11.581206 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.581163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:12:11.581383 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.581334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:12:11.581383 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:11.581333 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:12:11.581518 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:11.581455 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls podName:48231118-0790-422a-b4db-213ba79fda5b nodeName:}" failed. No retries permitted until 2026-04-16 18:12:27.581435899 +0000 UTC m=+172.505986541 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-g75tr" (UID: "48231118-0790-422a-b4db-213ba79fda5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:12:11.583801 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.583771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076fc18-c5fa-42c2-8798-729417891cb0-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-hrxm7\" (UID: \"9076fc18-c5fa-42c2-8798-729417891cb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:12:11.695854 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.695824 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" Apr 16 18:12:11.782147 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.782117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:11.782299 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.782287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:11.782922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.782895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-service-ca-bundle\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:11.785156 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.785131 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c2effa7-2ad2-44ba-aad9-85b432e50f7e-metrics-certs\") pod \"router-default-8459479994-94kcv\" (UID: \"1c2effa7-2ad2-44ba-aad9-85b432e50f7e\") " pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:11.819292 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.819259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7"] Apr 16 18:12:11.898669 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:11.898640 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:12.015587 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.015551 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8459479994-94kcv"] Apr 16 18:12:12.018462 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:12.018428 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2effa7_2ad2_44ba_aad9_85b432e50f7e.slice/crio-19c5b0439bba1e1bd121facf4bd7c5b36b53a262f21c3d72fc5a414610b71b2a WatchSource:0}: Error finding container 19c5b0439bba1e1bd121facf4bd7c5b36b53a262f21c3d72fc5a414610b71b2a: Status 404 returned error can't find the container with id 19c5b0439bba1e1bd121facf4bd7c5b36b53a262f21c3d72fc5a414610b71b2a Apr 16 18:12:12.082365 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.082326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8459479994-94kcv" event={"ID":"1c2effa7-2ad2-44ba-aad9-85b432e50f7e","Type":"ContainerStarted","Data":"8ee8ba9e872461b9bcceba468fbad5913b34525d3665cf1c59474dd2334ef20b"} Apr 16 18:12:12.082365 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.082369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8459479994-94kcv" event={"ID":"1c2effa7-2ad2-44ba-aad9-85b432e50f7e","Type":"ContainerStarted","Data":"19c5b0439bba1e1bd121facf4bd7c5b36b53a262f21c3d72fc5a414610b71b2a"} Apr 16 18:12:12.083518 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.083479 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" event={"ID":"9076fc18-c5fa-42c2-8798-729417891cb0","Type":"ContainerStarted","Data":"d0cac1a52b1dd3ebe2e35750d16eaaf0f8bde270f1d0b9d07c313211225e68b8"} Apr 16 18:12:12.107866 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.107810 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-8459479994-94kcv" podStartSLOduration=17.10779309 podStartE2EDuration="17.10779309s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:12.102634154 +0000 UTC m=+157.027184849" watchObservedRunningTime="2026-04-16 18:12:12.10779309 +0000 UTC m=+157.032343740" Apr 16 18:12:12.656845 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:12.656805 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lnrzm" podUID="9d27531f-08c4-4c67-974c-31cacc77b8be" Apr 16 18:12:12.899238 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.899203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:12.902005 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:12.901981 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:13.085969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:13.085921 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:13.087323 ip-10-0-142-228 kubenswrapper[2571]: 
I0416 18:12:13.087300 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-8459479994-94kcv" Apr 16 18:12:14.089810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:14.089778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" event={"ID":"9076fc18-c5fa-42c2-8798-729417891cb0","Type":"ContainerStarted","Data":"e3499ef021a957908a2c8300f9bbc5a28bde8ab34786d905b5a953393a2a483d"} Apr 16 18:12:14.089810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:14.089812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" event={"ID":"9076fc18-c5fa-42c2-8798-729417891cb0","Type":"ContainerStarted","Data":"e22155c09d7e2a7b5229040b867e4766f33782c297836de66247a3c2c6a3ed03"} Apr 16 18:12:14.109491 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:14.109433 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-hrxm7" podStartSLOduration=17.441922239 podStartE2EDuration="19.109415511s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:12:11.861379778 +0000 UTC m=+156.785930421" lastFinishedPulling="2026-04-16 18:12:13.528873046 +0000 UTC m=+158.453423693" observedRunningTime="2026-04-16 18:12:14.108515793 +0000 UTC m=+159.033066446" watchObservedRunningTime="2026-04-16 18:12:14.109415511 +0000 UTC m=+159.033966161" Apr 16 18:12:15.918334 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:15.918288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:15.918832 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:15.918354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:12:15.920602 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:15.920577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"image-registry-69f768f794-7jzj8\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") " pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:15.920602 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:15.920597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cbca814-7996-4973-bc09-b736c26d6348-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-wnnkf\" (UID: \"5cbca814-7996-4973-bc09-b736c26d6348\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:12:16.019749 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.019708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2" Apr 16 18:12:16.019919 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.019780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:16.022128 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.022101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07250da-0b72-4bc8-9129-e53b19a95890-metrics-tls\") pod \"dns-default-5hd6g\" (UID: \"d07250da-0b72-4bc8-9129-e53b19a95890\") " pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:16.022230 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.022160 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991e5741-2829-429b-a2bd-759f5392a792-cert\") pod \"ingress-canary-n9vj2\" (UID: \"991e5741-2829-429b-a2bd-759f5392a792\") " pod="openshift-ingress-canary/ingress-canary-n9vj2" Apr 16 18:12:16.182253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.182171 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bnqh9\"" Apr 16 18:12:16.182253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.182173 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tt7tt\"" Apr 16 18:12:16.182253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.182173 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ffhs9\"" Apr 16 18:12:16.189426 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.189404 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:16.189529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.189428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vj2" Apr 16 18:12:16.189529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.189502 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:16.352542 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.352264 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vj2"] Apr 16 18:12:16.354935 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:16.354898 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991e5741_2829_429b_a2bd_759f5392a792.slice/crio-501390372e21ae14efefc46eac97a764236508178a05bf58df6d7dc839f8838f WatchSource:0}: Error finding container 501390372e21ae14efefc46eac97a764236508178a05bf58df6d7dc839f8838f: Status 404 returned error can't find the container with id 501390372e21ae14efefc46eac97a764236508178a05bf58df6d7dc839f8838f Apr 16 18:12:16.571984 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.571955 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5hd6g"] Apr 16 18:12:16.574778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:16.574752 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"] Apr 16 18:12:16.575104 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:16.575074 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07250da_0b72_4bc8_9129_e53b19a95890.slice/crio-8e5d505b8aad1608b148d1d3dce7eebe93bb947fee119c767917d9f077362952 WatchSource:0}: Error finding container 8e5d505b8aad1608b148d1d3dce7eebe93bb947fee119c767917d9f077362952: Status 404 returned error can't find the container with id 8e5d505b8aad1608b148d1d3dce7eebe93bb947fee119c767917d9f077362952 Apr 16 18:12:16.578093 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:16.578070 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08bf3f64_b45f_4345_93cd_34468773149c.slice/crio-8511eb3fc3012afd582d612cfc70a1257910cebda717abc910bc6b7bfa82c9a5 WatchSource:0}: Error finding container 8511eb3fc3012afd582d612cfc70a1257910cebda717abc910bc6b7bfa82c9a5: Status 404 returned error can't find the container with id 8511eb3fc3012afd582d612cfc70a1257910cebda717abc910bc6b7bfa82c9a5 Apr 16 18:12:17.099277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.099236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" event={"ID":"08bf3f64-b45f-4345-93cd-34468773149c","Type":"ContainerStarted","Data":"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"} Apr 16 18:12:17.099277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.099284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" event={"ID":"08bf3f64-b45f-4345-93cd-34468773149c","Type":"ContainerStarted","Data":"8511eb3fc3012afd582d612cfc70a1257910cebda717abc910bc6b7bfa82c9a5"} Apr 16 18:12:17.099742 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.099381 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:17.100962 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.100933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vj2" 
event={"ID":"991e5741-2829-429b-a2bd-759f5392a792","Type":"ContainerStarted","Data":"501390372e21ae14efefc46eac97a764236508178a05bf58df6d7dc839f8838f"} Apr 16 18:12:17.102240 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.102215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5hd6g" event={"ID":"d07250da-0b72-4bc8-9129-e53b19a95890","Type":"ContainerStarted","Data":"8e5d505b8aad1608b148d1d3dce7eebe93bb947fee119c767917d9f077362952"} Apr 16 18:12:17.127422 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:17.127368 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" podStartSLOduration=161.127352748 podStartE2EDuration="2m41.127352748s" podCreationTimestamp="2026-04-16 18:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:17.127180981 +0000 UTC m=+162.051731632" watchObservedRunningTime="2026-04-16 18:12:17.127352748 +0000 UTC m=+162.051903398" Apr 16 18:12:19.108561 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.108527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vj2" event={"ID":"991e5741-2829-429b-a2bd-759f5392a792","Type":"ContainerStarted","Data":"3f32be6fadcd4582b9490c9d5741a26c282a32dbbd1ae027a3a40b4af0d618e8"} Apr 16 18:12:19.109905 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.109881 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5hd6g" event={"ID":"d07250da-0b72-4bc8-9129-e53b19a95890","Type":"ContainerStarted","Data":"fed0431af74aa2937693344051466153e11e579eb6e5df58372deea507aa9f70"} Apr 16 18:12:19.109905 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.109908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5hd6g" event={"ID":"d07250da-0b72-4bc8-9129-e53b19a95890","Type":"ContainerStarted","Data":"88365d362ac72b433a14060735dba47807329fde61299f0bd3a51a8e779c8dbd"} Apr 16 18:12:19.110086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.109992 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:19.124677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.124627 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n9vj2" podStartSLOduration=130.211106863 podStartE2EDuration="2m12.124614344s" podCreationTimestamp="2026-04-16 18:10:07 +0000 UTC" firstStartedPulling="2026-04-16 18:12:16.356865035 +0000 UTC m=+161.281415670" lastFinishedPulling="2026-04-16 18:12:18.270372519 +0000 UTC m=+163.194923151" observedRunningTime="2026-04-16 18:12:19.1238608 +0000 UTC m=+164.048411451" watchObservedRunningTime="2026-04-16 18:12:19.124614344 +0000 UTC m=+164.049164995" Apr 16 18:12:19.142150 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:19.142104 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5hd6g" podStartSLOduration=130.447240669 podStartE2EDuration="2m12.142089862s" podCreationTimestamp="2026-04-16 18:10:07 +0000 UTC" firstStartedPulling="2026-04-16 18:12:16.576992422 +0000 UTC m=+161.501543054" lastFinishedPulling="2026-04-16 18:12:18.271841615 +0000 UTC m=+163.196392247" observedRunningTime="2026-04-16 18:12:19.14060303 +0000 UTC m=+164.065153681" watchObservedRunningTime="2026-04-16 18:12:19.142089862 +0000 UTC 
m=+164.066640578" Apr 16 18:12:23.637081 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:23.637047 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:12:23.639549 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:23.639527 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l42ss\"" Apr 16 18:12:23.648270 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:23.648249 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" Apr 16 18:12:23.764198 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:23.764169 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf"] Apr 16 18:12:23.767990 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:23.767956 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cbca814_7996_4973_bc09_b736c26d6348.slice/crio-f69174857cfb67448185f6e924009b1c9ef8a6300913980ba1951bffa5b5065a WatchSource:0}: Error finding container f69174857cfb67448185f6e924009b1c9ef8a6300913980ba1951bffa5b5065a: Status 404 returned error can't find the container with id f69174857cfb67448185f6e924009b1c9ef8a6300913980ba1951bffa5b5065a Apr 16 18:12:24.124293 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:24.124255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" event={"ID":"5cbca814-7996-4973-bc09-b736c26d6348","Type":"ContainerStarted","Data":"f69174857cfb67448185f6e924009b1c9ef8a6300913980ba1951bffa5b5065a"} Apr 16 18:12:25.128481 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:25.128447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" event={"ID":"5cbca814-7996-4973-bc09-b736c26d6348","Type":"ContainerStarted","Data":"973c97e787017c2edfdab8a2accc6f93dff3c027ad52c1e86bb77a2ede79b3f8"} Apr 16 18:12:25.147822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:25.147777 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-wnnkf" podStartSLOduration=164.242245154 podStartE2EDuration="2m45.1477627s" podCreationTimestamp="2026-04-16 18:09:40 +0000 UTC" firstStartedPulling="2026-04-16 18:12:23.770230008 +0000 UTC m=+168.694780637" lastFinishedPulling="2026-04-16 18:12:24.675747539 +0000 UTC m=+169.600298183" observedRunningTime="2026-04-16 18:12:25.146897784 +0000 UTC m=+170.071448436" watchObservedRunningTime="2026-04-16 18:12:25.1477627 +0000 UTC m=+170.072313350" Apr 16 18:12:25.638697 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:25.638649 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:12:26.462101 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.462068 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jnxv6"] Apr 16 18:12:26.465529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.465508 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.468295 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.468275 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9j6xp\"" Apr 16 18:12:26.469113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.469095 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:12:26.469169 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.469116 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:12:26.474737 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.474717 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jnxv6"] Apr 16 18:12:26.602123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.602089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70d13903-4740-4a09-aeb9-aec340552ebf-crio-socket\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.602123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.602125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70d13903-4740-4a09-aeb9-aec340552ebf-data-volume\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.602358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.602146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70d13903-4740-4a09-aeb9-aec340552ebf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.602358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.602165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69229\" (UniqueName: \"kubernetes.io/projected/70d13903-4740-4a09-aeb9-aec340552ebf-kube-api-access-69229\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.602358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.602297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70d13903-4740-4a09-aeb9-aec340552ebf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.702873 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.702831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70d13903-4740-4a09-aeb9-aec340552ebf-crio-socket\") pod \"insights-runtime-extractor-jnxv6\" (UID: 
\"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.702873 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.702868 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70d13903-4740-4a09-aeb9-aec340552ebf-data-volume\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.702886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70d13903-4740-4a09-aeb9-aec340552ebf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.702907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69229\" (UniqueName: \"kubernetes.io/projected/70d13903-4740-4a09-aeb9-aec340552ebf-kube-api-access-69229\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.702964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70d13903-4740-4a09-aeb9-aec340552ebf-crio-socket\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.703055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70d13903-4740-4a09-aeb9-aec340552ebf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.703260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70d13903-4740-4a09-aeb9-aec340552ebf-data-volume\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.703507 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.703490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70d13903-4740-4a09-aeb9-aec340552ebf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.705277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.705254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70d13903-4740-4a09-aeb9-aec340552ebf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 
18:12:26.717375 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.717315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69229\" (UniqueName: \"kubernetes.io/projected/70d13903-4740-4a09-aeb9-aec340552ebf-kube-api-access-69229\") pod \"insights-runtime-extractor-jnxv6\" (UID: \"70d13903-4740-4a09-aeb9-aec340552ebf\") " pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.774155 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.774124 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jnxv6" Apr 16 18:12:26.915935 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:26.915899 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jnxv6"] Apr 16 18:12:26.919568 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:26.919539 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d13903_4740_4a09_aeb9_aec340552ebf.slice/crio-5ecd40023d02ed8f6c58ede82d259ad88ec49fe6254641083ed5f2b467573481 WatchSource:0}: Error finding container 5ecd40023d02ed8f6c58ede82d259ad88ec49fe6254641083ed5f2b467573481: Status 404 returned error can't find the container with id 5ecd40023d02ed8f6c58ede82d259ad88ec49fe6254641083ed5f2b467573481 Apr 16 18:12:27.135174 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:27.135140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jnxv6" event={"ID":"70d13903-4740-4a09-aeb9-aec340552ebf","Type":"ContainerStarted","Data":"f2bc4e4ca6a54cc9e6a6a34b1624cf0b1d813ba258a88bd092794f07fe9847c3"} Apr 16 18:12:27.135174 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:27.135175 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jnxv6" event={"ID":"70d13903-4740-4a09-aeb9-aec340552ebf","Type":"ContainerStarted","Data":"5ecd40023d02ed8f6c58ede82d259ad88ec49fe6254641083ed5f2b467573481"} Apr 16 18:12:27.610223 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:27.610184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:12:27.612772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:27.612745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/48231118-0790-422a-b4db-213ba79fda5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-g75tr\" (UID: \"48231118-0790-422a-b4db-213ba79fda5b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:12:27.886986 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:27.886903 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" Apr 16 18:12:28.005960 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:28.005928 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr"] Apr 16 18:12:28.008934 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:28.008905 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48231118_0790_422a_b4db_213ba79fda5b.slice/crio-4974c4414d595c343693cb01a5478f39b4227dc7361f426596f631789311eb0e WatchSource:0}: Error finding container 4974c4414d595c343693cb01a5478f39b4227dc7361f426596f631789311eb0e: Status 404 returned error can't find the container with id 4974c4414d595c343693cb01a5478f39b4227dc7361f426596f631789311eb0e Apr 16 18:12:28.140194 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:28.140113 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jnxv6" event={"ID":"70d13903-4740-4a09-aeb9-aec340552ebf","Type":"ContainerStarted","Data":"4b81db3eabb831f53b0ad094fe57fe1ae125de8032b3682bc29fbe9241cf46c4"} Apr 16 18:12:28.141288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:28.141263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" event={"ID":"48231118-0790-422a-b4db-213ba79fda5b","Type":"ContainerStarted","Data":"4974c4414d595c343693cb01a5478f39b4227dc7361f426596f631789311eb0e"} Apr 16 18:12:29.114943 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.114913 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5hd6g" Apr 16 18:12:29.461183 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.461146 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-784788794d-wmkr8"] Apr 16 18:12:29.464764 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.464737 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.467231 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.467186 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:12:29.467231 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.467224 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:12:29.467415 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.467264 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:12:29.467543 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.467528 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:12:29.468162 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.468144 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:12:29.468266 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.468175 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p8ftn\"" Apr 16 18:12:29.468266 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.468250 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:12:29.468382 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.468369 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:12:29.474299 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.474274 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-784788794d-wmkr8"] Apr 16 18:12:29.627639 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627618 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.627742 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.627742 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.627815 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config\") pod 
\"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.627815 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627773 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptkr\" (UniqueName: \"kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.627815 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.627790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.728931 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.728849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729095 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.728963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729095 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jptkr\" (UniqueName: \"kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729095 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729249 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729249 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729541 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729515 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729669 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729653 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.729907 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.729885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.731782 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.731762 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.731892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.731825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.739971 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.739933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptkr\" (UniqueName: \"kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr\") pod \"console-784788794d-wmkr8\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") " pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.775787 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.775750 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:29.907756 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:29.907734 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-784788794d-wmkr8"] Apr 16 18:12:29.910526 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:29.910497 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8779f5f7_abfa_4e15_82cd_ede83b6785e9.slice/crio-52142eb788b966850392242eb42715b2a1553059994a95b960941cabfb4b5e80 WatchSource:0}: Error finding container 52142eb788b966850392242eb42715b2a1553059994a95b960941cabfb4b5e80: Status 404 returned error can't find the container with id 52142eb788b966850392242eb42715b2a1553059994a95b960941cabfb4b5e80 Apr 16 18:12:30.149331 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:30.149246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jnxv6" event={"ID":"70d13903-4740-4a09-aeb9-aec340552ebf","Type":"ContainerStarted","Data":"3a4871393147dfc09e2dd1961717a8e179d0be52f811971e4b97c4b67e15ecd0"} Apr 16 18:12:30.150757 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:30.150724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" event={"ID":"48231118-0790-422a-b4db-213ba79fda5b","Type":"ContainerStarted","Data":"b066b977281f1b384e5e2674329e3fc50dd819306fa263e840d2b1b0f057323e"} Apr 16 18:12:30.151737 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:30.151718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-784788794d-wmkr8" event={"ID":"8779f5f7-abfa-4e15-82cd-ede83b6785e9","Type":"ContainerStarted","Data":"52142eb788b966850392242eb42715b2a1553059994a95b960941cabfb4b5e80"} Apr 16 18:12:30.169976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:30.169931 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jnxv6" podStartSLOduration=2.055748594 podStartE2EDuration="4.169919451s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="2026-04-16 18:12:26.971386037 +0000 UTC m=+171.895936680" lastFinishedPulling="2026-04-16 18:12:29.085556905 +0000 UTC m=+174.010107537" observedRunningTime="2026-04-16 18:12:30.168961077 +0000 UTC m=+175.093511727" watchObservedRunningTime="2026-04-16 18:12:30.169919451 +0000 UTC m=+175.094470102" Apr 16 18:12:30.186981 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:30.186927 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" podStartSLOduration=33.580642981 podStartE2EDuration="35.186914396s" podCreationTimestamp="2026-04-16 18:11:55 +0000 UTC" firstStartedPulling="2026-04-16 18:12:28.01078912 +0000 UTC m=+172.935339753" lastFinishedPulling="2026-04-16 18:12:29.617060522 +0000 UTC m=+174.541611168" observedRunningTime="2026-04-16 18:12:30.186353046 +0000 UTC m=+175.110903735" watchObservedRunningTime="2026-04-16 18:12:30.186914396 +0000 UTC m=+175.111465046" Apr 16 18:12:33.161198 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:33.161111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-784788794d-wmkr8" event={"ID":"8779f5f7-abfa-4e15-82cd-ede83b6785e9","Type":"ContainerStarted","Data":"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661"} Apr 16 18:12:33.180239 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:33.180188 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-784788794d-wmkr8" podStartSLOduration=1.201280342 podStartE2EDuration="4.180172734s" podCreationTimestamp="2026-04-16 18:12:29 +0000 UTC" firstStartedPulling="2026-04-16 18:12:29.912599658 +0000 UTC m=+174.837150290" lastFinishedPulling="2026-04-16 18:12:32.891492053 +0000 UTC m=+177.816042682" observedRunningTime="2026-04-16 18:12:33.179391951 +0000 UTC m=+178.103942602" watchObservedRunningTime="2026-04-16 18:12:33.180172734 +0000 UTC m=+178.104723385" Apr 16 18:12:36.193533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:36.193497 2571 patch_prober.go:28] interesting pod/image-registry-69f768f794-7jzj8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:36.193921 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:36.193551 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" podUID="08bf3f64-b45f-4345-93cd-34468773149c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:38.109094 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:38.109068 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" Apr 16 18:12:39.178222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.178197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:12:39.178583 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.178236 2571 generic.go:358] "Generic (PLEG): container finished" podID="48231118-0790-422a-b4db-213ba79fda5b" containerID="b066b977281f1b384e5e2674329e3fc50dd819306fa263e840d2b1b0f057323e" exitCode=2 Apr 16 18:12:39.178583 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.178298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" event={"ID":"48231118-0790-422a-b4db-213ba79fda5b","Type":"ContainerDied","Data":"b066b977281f1b384e5e2674329e3fc50dd819306fa263e840d2b1b0f057323e"} Apr 16 18:12:39.178728 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.178607 2571 scope.go:117] "RemoveContainer" containerID="b066b977281f1b384e5e2674329e3fc50dd819306fa263e840d2b1b0f057323e" Apr 16 18:12:39.776771 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.776720 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:39.776771 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.776776 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:39.781578 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:39.781548 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:40.183029 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:40.182955 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:12:40.183443 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:40.183081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-g75tr" event={"ID":"48231118-0790-422a-b4db-213ba79fda5b","Type":"ContainerStarted","Data":"2147a2ce1a5837133edcd3d4d765776a348599a458a8f070345ef3f1df152186"} Apr 16 18:12:40.186990 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:40.186964 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:12:42.783285 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.783252 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn"] Apr 16 18:12:42.786844 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.786826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:42.788902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.788882 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:12:42.789625 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.789606 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:12:42.789625 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.789617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8vdmv\"" Apr 16 18:12:42.789772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.789618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:12:42.798017 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.797993 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn"] Apr 16 18:12:42.930090 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.930056 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2d7kk"] Apr 16 18:12:42.933407 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.933384 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:42.935578 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.935553 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:12:42.935697 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.935600 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:12:42.935697 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.935634 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:12:42.935957 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.935934 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qbj8r\"" Apr 16 18:12:42.938156 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.938133 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:42.938288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.938175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:42.938288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.938205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6zc\" (UniqueName: \"kubernetes.io/projected/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-kube-api-access-5q6zc\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:42.938288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:42.938261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.039426 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039426 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jm2pb\" (UniqueName: \"kubernetes.io/projected/2b610fe3-ab36-4043-8263-fcb26b8dbd58-kube-api-access-jm2pb\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-wtmp\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-root\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-sys\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-textfile\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039826 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.039826 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.039826 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" 
(UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.039826 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6zc\" (UniqueName: \"kubernetes.io/projected/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-kube-api-access-5q6zc\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.039826 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.040044 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.039827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-metrics-client-ca\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.040414 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.040394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.042076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.042057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.042243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.042227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.048451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.048426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6zc\" (UniqueName: \"kubernetes.io/projected/4fb2e2d9-d9cf-4abc-94f7-195ef81bc483-kube-api-access-5q6zc\") pod \"openshift-state-metrics-5669946b84-h8cvn\" (UID: \"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.095552 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.095511 2571 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" Apr 16 18:12:43.140355 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140355 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2pb\" (UniqueName: \"kubernetes.io/projected/2b610fe3-ab36-4043-8263-fcb26b8dbd58-kube-api-access-jm2pb\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-wtmp\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-root\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-root\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-sys\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-wtmp\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b610fe3-ab36-4043-8263-fcb26b8dbd58-sys\") pod 
\"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140963 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-textfile\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140963 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:43.140676 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:12:43.140963 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140963 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.140748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-metrics-client-ca\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.140963 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:43.140771 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls podName:2b610fe3-ab36-4043-8263-fcb26b8dbd58 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:43.640752971 +0000 UTC m=+188.565303605 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls") pod "node-exporter-2d7kk" (UID: "2b610fe3-ab36-4043-8263-fcb26b8dbd58") : secret "node-exporter-tls" not found Apr 16 18:12:43.146073 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.141627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-textfile\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.146073 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.141937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-metrics-client-ca\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.146073 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.142041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.146073 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.143250 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.153583 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.153546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2pb\" (UniqueName: \"kubernetes.io/projected/2b610fe3-ab36-4043-8263-fcb26b8dbd58-kube-api-access-jm2pb\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.227019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.226974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn"] Apr 16 18:12:43.229658 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:43.229629 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb2e2d9_d9cf_4abc_94f7_195ef81bc483.slice/crio-4671ec42a1775dbd8aac2047dd0e9d0ccdde35970debe0c9b9d5d462d888f58e WatchSource:0}: Error finding container 4671ec42a1775dbd8aac2047dd0e9d0ccdde35970debe0c9b9d5d462d888f58e: Status 404 returned error can't find the container with id 4671ec42a1775dbd8aac2047dd0e9d0ccdde35970debe0c9b9d5d462d888f58e Apr 16 18:12:43.646311 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.646227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 
18:12:43.648728 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.648704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b610fe3-ab36-4043-8263-fcb26b8dbd58-node-exporter-tls\") pod \"node-exporter-2d7kk\" (UID: \"2b610fe3-ab36-4043-8263-fcb26b8dbd58\") " pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.842710 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:43.842661 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2d7kk" Apr 16 18:12:43.851662 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:43.851627 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b610fe3_ab36_4043_8263_fcb26b8dbd58.slice/crio-d3943718f03d7d30b54c642280605c9017495032b3a1aafbe57a781b6c45c101 WatchSource:0}: Error finding container d3943718f03d7d30b54c642280605c9017495032b3a1aafbe57a781b6c45c101: Status 404 returned error can't find the container with id d3943718f03d7d30b54c642280605c9017495032b3a1aafbe57a781b6c45c101 Apr 16 18:12:44.194669 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:44.194626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2d7kk" event={"ID":"2b610fe3-ab36-4043-8263-fcb26b8dbd58","Type":"ContainerStarted","Data":"d3943718f03d7d30b54c642280605c9017495032b3a1aafbe57a781b6c45c101"} Apr 16 18:12:44.196673 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:44.196642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" event={"ID":"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483","Type":"ContainerStarted","Data":"47403e69aba6ca69646ae280b1e01c0c82436c670328f3db274d6cd1bded17d6"} Apr 16 18:12:44.196822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:44.196681 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" event={"ID":"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483","Type":"ContainerStarted","Data":"21e6f9cd2af1ef54b5c8303e09ed4d6fe07e0283f26528310d565abe8adaa76a"} Apr 16 18:12:44.196822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:44.196714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" event={"ID":"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483","Type":"ContainerStarted","Data":"4671ec42a1775dbd8aac2047dd0e9d0ccdde35970debe0c9b9d5d462d888f58e"} Apr 16 18:12:45.200968 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.200930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" event={"ID":"4fb2e2d9-d9cf-4abc-94f7-195ef81bc483","Type":"ContainerStarted","Data":"b8db226a90d7352e134b21d564032f231eff3a184bc76767ccb9926fc1bf9b94"} Apr 16 18:12:45.202448 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.202425 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b610fe3-ab36-4043-8263-fcb26b8dbd58" containerID="d0727feabde1c0481706251300d616c7ae6095588400abf6142f99114a8dfab5" exitCode=0 Apr 16 18:12:45.202568 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.202468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2d7kk" event={"ID":"2b610fe3-ab36-4043-8263-fcb26b8dbd58","Type":"ContainerDied","Data":"d0727feabde1c0481706251300d616c7ae6095588400abf6142f99114a8dfab5"} Apr 16 18:12:45.222283 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.222230 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-h8cvn" podStartSLOduration=2.230147061 podStartE2EDuration="3.222209791s" podCreationTimestamp="2026-04-16 18:12:42 +0000 UTC" firstStartedPulling="2026-04-16 18:12:43.357226041 +0000 UTC m=+188.281776670" lastFinishedPulling="2026-04-16 18:12:44.349288758 +0000 UTC m=+189.273839400" observedRunningTime="2026-04-16 18:12:45.221149457 +0000 UTC m=+190.145700138" watchObservedRunningTime="2026-04-16 18:12:45.222209791 +0000 UTC m=+190.146760443" Apr 16 18:12:45.953306 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.953270 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c4899b649-g46dq"] Apr 16 18:12:45.957096 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.957073 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" Apr 16 18:12:45.959410 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959382 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:12:45.959582 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959547 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:12:45.959706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959592 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pmfdb\"" Apr 16 18:12:45.959706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:12:45.959706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959644 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:12:45.959874 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:12:45.959874 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.959790 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7dlpob26mk0c1\"" Apr 16 18:12:45.966728 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxs7\" (UniqueName: \"kubernetes.io/projected/b942ff78-36e1-45e2-bf1f-bec607da9918-kube-api-access-prxs7\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" Apr 16 18:12:45.966842 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b942ff78-36e1-45e2-bf1f-bec607da9918-metrics-client-ca\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" Apr 16 18:12:45.966842 
Apr 16 18:12:45.966842 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966773 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.966842 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966796 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.966842 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.967054 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-grpc-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.967054 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.966946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.967054 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.967004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:45.967336 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:45.967321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c4899b649-g46dq"]
Apr 16 18:12:46.068292 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068292 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-grpc-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prxs7\" (UniqueName: \"kubernetes.io/projected/b942ff78-36e1-45e2-bf1f-bec607da9918-kube-api-access-prxs7\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b942ff78-36e1-45e2-bf1f-bec607da9918-metrics-client-ca\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.068533 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.068512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.069374 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.069345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b942ff78-36e1-45e2-bf1f-bec607da9918-metrics-client-ca\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.071447 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.071421 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.071566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.071480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-grpc-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.071844 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.071823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.071944 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.071930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.072007 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.071987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-tls\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.072084 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.072063 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b942ff78-36e1-45e2-bf1f-bec607da9918-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.076988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.076966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxs7\" (UniqueName: \"kubernetes.io/projected/b942ff78-36e1-45e2-bf1f-bec607da9918-kube-api-access-prxs7\") pod \"thanos-querier-5c4899b649-g46dq\" (UID: \"b942ff78-36e1-45e2-bf1f-bec607da9918\") " pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.207965 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.207864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2d7kk" event={"ID":"2b610fe3-ab36-4043-8263-fcb26b8dbd58","Type":"ContainerStarted","Data":"6a585cb53dec826afa5b110924905cb739fafb1b57f0f37d1d7aee347697099b"}
Apr 16 18:12:46.207965 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.207915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2d7kk" event={"ID":"2b610fe3-ab36-4043-8263-fcb26b8dbd58","Type":"ContainerStarted","Data":"f8f94ea1ac727c02d1a5fe3a3a6a5d553a4037342ceef70d3d80eb3f08d1cb17"}
Apr 16 18:12:46.229305 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.229257 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2d7kk" podStartSLOduration=3.344296552 podStartE2EDuration="4.229239158s" podCreationTimestamp="2026-04-16 18:12:42 +0000 UTC" firstStartedPulling="2026-04-16 18:12:43.85347534 +0000 UTC m=+188.778025970" lastFinishedPulling="2026-04-16 18:12:44.738417942 +0000 UTC m=+189.662968576" observedRunningTime="2026-04-16 18:12:46.228058122 +0000 UTC m=+191.152608772" watchObservedRunningTime="2026-04-16 18:12:46.229239158 +0000 UTC m=+191.153789809"
Apr 16 18:12:46.267563 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.267534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:46.394045 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:46.394012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c4899b649-g46dq"]
Apr 16 18:12:46.397742 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:46.397714 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb942ff78_36e1_45e2_bf1f_bec607da9918.slice/crio-501a26e4707773e96cb6d7d9a8fd57a84eb35b15515f00ff0a4190c8b740988e WatchSource:0}: Error finding container 501a26e4707773e96cb6d7d9a8fd57a84eb35b15515f00ff0a4190c8b740988e: Status 404 returned error can't find the container with id 501a26e4707773e96cb6d7d9a8fd57a84eb35b15515f00ff0a4190c8b740988e
Apr 16 18:12:47.212596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.212553 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"501a26e4707773e96cb6d7d9a8fd57a84eb35b15515f00ff0a4190c8b740988e"}
Apr 16 18:12:47.684784 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.684679 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"]
Apr 16 18:12:47.688159 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.688134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:47.690238 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.690211 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 18:12:47.690342 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.690271 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-th4bn\""
Apr 16 18:12:47.698070 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.698043 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"]
Apr 16 18:12:47.780914 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.780877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwbtf\" (UID: \"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:47.882264 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:47.882227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwbtf\" (UID: \"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:47.882470 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:47.882387 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:12:47.882540 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:12:47.882474 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert podName:3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a nodeName:}" failed. No retries permitted until 2026-04-16 18:12:48.382454586 +0000 UTC m=+193.307005227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-kwbtf" (UID: "3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a") : secret "monitoring-plugin-cert" not found
Apr 16 18:12:48.283351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.283315 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"]
Apr 16 18:12:48.286725 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.286702 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86dcb9b989-jpc8v"
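
The monitoring-plugin-cert mount fails because the pod reached the kubelet before its secret existed; nestedpendingoperations then blocks retries for 500ms (durationBeforeRetry), and the kubelet keeps retrying with a growing backoff until the secret shows up, which it does shortly afterwards (the SetUp succeeded entry at 18:12:48.389521 below). A minimal sketch of that retry shape: the 500ms base is from the log, while the doubling factor and the cap are illustrative assumptions, and getSecret is a made-up stand-in for the API lookup.

package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "monitoring-plugin-cert" not found`)

// getSecret stands in for the API lookup; here it succeeds on the third try.
func getSecret(attempt int) error {
	if attempt < 3 {
		return errNotFound
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond // durationBeforeRetry from the log
	const maxDelay = 2 * time.Minute // illustrative cap, not kubelet's constant
	for attempt := 1; ; attempt++ {
		if err := getSecret(attempt); err == nil {
			fmt.Println("mounted after attempt", attempt)
			return
		}
		fmt.Printf("attempt %d failed; retrying in %v\n", attempt, delay)
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}
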
Apr 16 18:12:48.294220 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.294198 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:12:48.314118 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.314018 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"]
Apr 16 18:12:48.387219 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387219 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387333 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387220 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387333 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387333 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwbtf\" (UID: \"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:48.387333 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqmv\" (UniqueName: \"kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387333 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.387506 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.387345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.389552 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.389521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwbtf\" (UID: \"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:48.487832 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.487801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.487997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.487885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.487997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.487914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.487997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.487939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.487997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.487971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqmv\" (UniqueName: \"kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.487997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.488003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.488376 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.488053 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.488721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.488617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.488721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.488673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.489152 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.489125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.489483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.489463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.490831 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.490810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.491284 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.491260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.505982 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.505959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqmv\" (UniqueName: \"kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv\") pod \"console-86dcb9b989-jpc8v\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.530204 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.530179 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"]
Apr 16 18:12:48.597220 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.597141 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:48.598754 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.598734 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:48.746741 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.746710 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"]
Apr 16 18:12:48.749235 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:48.749206 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db24e4b_40c0_4c3a_91d1_6c2cc0904f8a.slice/crio-284d1b5f47cd87f06be5fe30faf19fea801610a80220878c2085d9ebe01db5f5 WatchSource:0}: Error finding container 284d1b5f47cd87f06be5fe30faf19fea801610a80220878c2085d9ebe01db5f5: Status 404 returned error can't find the container with id 284d1b5f47cd87f06be5fe30faf19fea801610a80220878c2085d9ebe01db5f5
Apr 16 18:12:48.933243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:48.933160 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"]
Apr 16 18:12:48.935909 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:12:48.935878 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5892585b_0a69_4e74_ac1c_a39d29ce132e.slice/crio-d4513fecc78c5eefa159721e6c474f1a75a93f4a16b5e84bb1f7b9fd84447b2b WatchSource:0}: Error finding container d4513fecc78c5eefa159721e6c474f1a75a93f4a16b5e84bb1f7b9fd84447b2b: Status 404 returned error can't find the container with id d4513fecc78c5eefa159721e6c474f1a75a93f4a16b5e84bb1f7b9fd84447b2b
Apr 16 18:12:49.221138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.221103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"069bb5ae059c5bbfe3ede05a7df412358574cef7200c06d6cfe66e2a837d1fdf"}
Apr 16 18:12:49.221138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.221141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"2129e404a76d2be07e5205e4e73290b94d8626cdb7a4273693f98216b948e7f7"}
Apr 16 18:12:49.221380 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.221156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"51c5171ab29a65dd9ac4598d575ca061b3cb35a2484fc950cdc95bd6fb459585"}
Apr 16 18:12:49.222129 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.222101 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf" event={"ID":"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a","Type":"ContainerStarted","Data":"284d1b5f47cd87f06be5fe30faf19fea801610a80220878c2085d9ebe01db5f5"}
Apr 16 18:12:49.223371 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.223345 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86dcb9b989-jpc8v" event={"ID":"5892585b-0a69-4e74-ac1c-a39d29ce132e","Type":"ContainerStarted","Data":"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd"}
Apr 16 18:12:49.223477 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.223377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86dcb9b989-jpc8v" event={"ID":"5892585b-0a69-4e74-ac1c-a39d29ce132e","Type":"ContainerStarted","Data":"d4513fecc78c5eefa159721e6c474f1a75a93f4a16b5e84bb1f7b9fd84447b2b"}
Apr 16 18:12:49.240947 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:49.240906 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86dcb9b989-jpc8v" podStartSLOduration=1.240893937 podStartE2EDuration="1.240893937s" podCreationTimestamp="2026-04-16 18:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:49.240846514 +0000 UTC m=+194.165397178" watchObservedRunningTime="2026-04-16 18:12:49.240893937 +0000 UTC m=+194.165444588"
Apr 16 18:12:50.227379 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.227344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf" event={"ID":"3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a","Type":"ContainerStarted","Data":"cc9d7732429909d44c71880120415af16f64ae800881aa0267c2576f9720a617"}
Apr 16 18:12:50.227800 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.227389 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:50.230104 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.230080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"f060b18598455d3ba5734d063cb83557448d8a1d2b5f63837b82a631547f4f13"}
Apr 16 18:12:50.230216 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.230111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"20224b5201604d6602d5f6d617ee7e470656f5b2843368f31ba992f529a0a3c2"}
Apr 16 18:12:50.230216 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.230124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" event={"ID":"b942ff78-36e1-45e2-bf1f-bec607da9918","Type":"ContainerStarted","Data":"6386616040c74c0ae351a3cf566ba3434a1b162611bb2c49ef8bc727063d54b4"}
Apr 16 18:12:50.230458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.230443 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:50.232792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.232773 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf"
Apr 16 18:12:50.244620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.244580 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwbtf" podStartSLOduration=1.875683858 podStartE2EDuration="3.244568609s" podCreationTimestamp="2026-04-16 18:12:47 +0000 UTC" firstStartedPulling="2026-04-16 18:12:48.751067084 +0000 UTC m=+193.675617726" lastFinishedPulling="2026-04-16 18:12:50.119951835 +0000 UTC m=+195.044502477" observedRunningTime="2026-04-16 18:12:50.243187744 +0000 UTC m=+195.167738396" watchObservedRunningTime="2026-04-16 18:12:50.244568609 +0000 UTC m=+195.169119238"
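
Note the console latency entry above: firstStartedPulling and lastFinishedPulling are the Go zero time (0001-01-01 00:00:00), meaning no image pull ran because the image was already on the node, so podStartSLOduration equals podStartE2EDuration exactly (1.240893937s). Anything post-processing these entries should guard for that sentinel; a minimal sketch, assuming the timestamps have already been parsed into time.Time values:

package main

import (
	"fmt"
	"time"
)

func main() {
	var first, last time.Time // zero values, as logged for the console pod
	e2e := 1240893937 * time.Nanosecond // podStartE2EDuration

	pull := time.Duration(0)
	if !first.IsZero() && !last.IsZero() { // zero time => no pull occurred
		pull = last.Sub(first)
	}
	fmt.Println("slo =", e2e-pull) // slo == e2e when nothing was pulled
}
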
Apr 16 18:12:50.270354 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:50.270311 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq" podStartSLOduration=2.057564582 podStartE2EDuration="5.270294405s" podCreationTimestamp="2026-04-16 18:12:45 +0000 UTC" firstStartedPulling="2026-04-16 18:12:46.3998141 +0000 UTC m=+191.324364744" lastFinishedPulling="2026-04-16 18:12:49.612543939 +0000 UTC m=+194.537094567" observedRunningTime="2026-04-16 18:12:50.269370165 +0000 UTC m=+195.193920839" watchObservedRunningTime="2026-04-16 18:12:50.270294405 +0000 UTC m=+195.194845058"
Apr 16 18:12:56.239364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:56.239336 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c4899b649-g46dq"
Apr 16 18:12:58.597389 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:58.597351 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:58.597886 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:58.597432 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:58.601997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:58.601979 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:59.259852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:59.259824 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86dcb9b989-jpc8v"
Apr 16 18:12:59.326277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:12:59.326236 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-784788794d-wmkr8"]
Apr 16 18:13:09.285593 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:09.285558 2571 generic.go:358] "Generic (PLEG): container finished" podID="7cd24549-bac0-49c2-ab16-4e779bd2e01e" containerID="22975511551a798bcb74ce1c7a3270903051e604c4528d0a0b38227c1926cdd7" exitCode=0
Apr 16 18:13:09.286007 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:09.285634 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" event={"ID":"7cd24549-bac0-49c2-ab16-4e779bd2e01e","Type":"ContainerDied","Data":"22975511551a798bcb74ce1c7a3270903051e604c4528d0a0b38227c1926cdd7"}
Apr 16 18:13:09.286007 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:09.285963 2571 scope.go:117] "RemoveContainer" containerID="22975511551a798bcb74ce1c7a3270903051e604c4528d0a0b38227c1926cdd7"
Apr 16 18:13:10.289570 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:10.289537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-xwqsm" event={"ID":"7cd24549-bac0-49c2-ab16-4e779bd2e01e","Type":"ContainerStarted","Data":"e4cb92fca30cf38adff95fd0f7fd6ca5a84b09c316b888cfadd5a0384e731435"}
Apr 16 18:13:13.550058 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.550016 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" podUID="08bf3f64-b45f-4345-93cd-34468773149c" containerName="registry" containerID="cri-o://8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f" gracePeriod=30
Apr 16 18:13:13.784427 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.784404 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:13:13.908806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908674 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.908806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908764 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.908806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908812 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908871 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908903 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908942 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2jqx\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.908967 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted\") pod \"08bf3f64-b45f-4345-93cd-34468773149c\" (UID: \"08bf3f64-b45f-4345-93cd-34468773149c\") "
Apr 16 18:13:13.909623 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.909565 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.909828 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.909807 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:13.911810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.911772 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx" (OuterVolumeSpecName: "kube-api-access-j2jqx") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "kube-api-access-j2jqx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:13.911916 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.911834 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:13.911916 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.911876 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:13.912030 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.912000 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:13.912123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.912061 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:13.917379 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:13.917351 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "08bf3f64-b45f-4345-93cd-34468773149c" (UID: "08bf3f64-b45f-4345-93cd-34468773149c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:13:14.010473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010436 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-installation-pull-secrets\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010466 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-registry-certificates\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010476 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-bound-sa-token\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010486 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08bf3f64-b45f-4345-93cd-34468773149c-trusted-ca\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010496 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-registry-tls\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010505 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/08bf3f64-b45f-4345-93cd-34468773149c-image-registry-private-configuration\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010514 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2jqx\" (UniqueName: \"kubernetes.io/projected/08bf3f64-b45f-4345-93cd-34468773149c-kube-api-access-j2jqx\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.010720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.010523 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08bf3f64-b45f-4345-93cd-34468773149c-ca-trust-extracted\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:14.302332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.302301 2571 generic.go:358] "Generic (PLEG): container finished" podID="08bf3f64-b45f-4345-93cd-34468773149c" containerID="8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f" exitCode=0
Apr 16 18:13:14.302514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.302366 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69f768f794-7jzj8"
Apr 16 18:13:14.302514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.302391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" event={"ID":"08bf3f64-b45f-4345-93cd-34468773149c","Type":"ContainerDied","Data":"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"}
Apr 16 18:13:14.302514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.302434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f768f794-7jzj8" event={"ID":"08bf3f64-b45f-4345-93cd-34468773149c","Type":"ContainerDied","Data":"8511eb3fc3012afd582d612cfc70a1257910cebda717abc910bc6b7bfa82c9a5"}
Apr 16 18:13:14.302514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.302454 2571 scope.go:117] "RemoveContainer" containerID="8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"
Apr 16 18:13:14.311513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.311497 2571 scope.go:117] "RemoveContainer" containerID="8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"
Apr 16 18:13:14.311784 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:13:14.311763 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f\": container with ID starting with 8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f not found: ID does not exist" containerID="8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"
Apr 16 18:13:14.311853 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.311795 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f"} err="failed to get container status \"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f\": rpc error: code = NotFound desc = could not find container \"8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f\": container with ID starting with 8a3eeee88f1163148f7a87fa3abb9e096e2cc418332be58eb302d77eebffcb3f not found: ID does not exist"
Apr 16 18:13:14.324065 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.324044 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"]
Apr 16 18:13:14.326403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:14.326381 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69f768f794-7jzj8"]
Apr 16 18:13:15.640380 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:15.640347 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08bf3f64-b45f-4345-93cd-34468773149c" path="/var/lib/kubelet/pods/08bf3f64-b45f-4345-93cd-34468773149c/volumes"
Apr 16 18:13:24.349860 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.349795 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-784788794d-wmkr8" podUID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" containerName="console" containerID="cri-o://8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661" gracePeriod=15
Apr 16 18:13:24.595030 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.595001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-784788794d-wmkr8_8779f5f7-abfa-4e15-82cd-ede83b6785e9/console/0.log"
Apr 16 18:13:24.595149 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.595060 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-784788794d-wmkr8"
Apr 16 18:13:24.700235 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700204 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700254 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700281 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700305 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700355 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700551 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700388 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jptkr\" (UniqueName: \"kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr\") pod \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\" (UID: \"8779f5f7-abfa-4e15-82cd-ede83b6785e9\") "
Apr 16 18:13:24.700781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700745 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:24.700781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700768 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca" (OuterVolumeSpecName: "service-ca") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:24.700931 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.700834 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config" (OuterVolumeSpecName: "console-config") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:24.702607 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.702577 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:24.702798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.702602 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:24.702798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.702610 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr" (OuterVolumeSpecName: "kube-api-access-jptkr") pod "8779f5f7-abfa-4e15-82cd-ede83b6785e9" (UID: "8779f5f7-abfa-4e15-82cd-ede83b6785e9"). InnerVolumeSpecName "kube-api-access-jptkr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:24.801558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801523 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-oauth-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:24.801558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801553 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:24.801558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801563 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:24.801803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801572 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jptkr\" (UniqueName: \"kubernetes.io/projected/8779f5f7-abfa-4e15-82cd-ede83b6785e9-kube-api-access-jptkr\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:24.801803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801580 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8779f5f7-abfa-4e15-82cd-ede83b6785e9-console-oauth-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:24.801803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:24.801589 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8779f5f7-abfa-4e15-82cd-ede83b6785e9-service-ca\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:13:25.337828 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-784788794d-wmkr8_8779f5f7-abfa-4e15-82cd-ede83b6785e9/console/0.log"
Apr 16 18:13:25.337993 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337838 2571 generic.go:358] "Generic (PLEG): container finished" podID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" containerID="8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661" exitCode=2
Apr 16 18:13:25.337993 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-784788794d-wmkr8" event={"ID":"8779f5f7-abfa-4e15-82cd-ede83b6785e9","Type":"ContainerDied","Data":"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661"}
Apr 16 18:13:25.337993 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337909 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-784788794d-wmkr8" event={"ID":"8779f5f7-abfa-4e15-82cd-ede83b6785e9","Type":"ContainerDied","Data":"52142eb788b966850392242eb42715b2a1553059994a95b960941cabfb4b5e80"}
Apr 16 18:13:25.337993 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337920 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-784788794d-wmkr8"
Need to start a new one" pod="openshift-console/console-784788794d-wmkr8" Apr 16 18:13:25.337993 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.337928 2571 scope.go:117] "RemoveContainer" containerID="8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661" Apr 16 18:13:25.345973 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.345954 2571 scope.go:117] "RemoveContainer" containerID="8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661" Apr 16 18:13:25.346235 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:13:25.346214 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661\": container with ID starting with 8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661 not found: ID does not exist" containerID="8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661" Apr 16 18:13:25.346288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.346243 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661"} err="failed to get container status \"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661\": rpc error: code = NotFound desc = could not find container \"8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661\": container with ID starting with 8eaee75d6d7bd4de2d06faaa9677c9dcf06cdbd231e498f389beb1bbfa482661 not found: ID does not exist" Apr 16 18:13:25.363065 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.363040 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-784788794d-wmkr8"] Apr 16 18:13:25.368631 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.368604 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-784788794d-wmkr8"] Apr 16 18:13:25.640245 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:25.640170 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" path="/var/lib/kubelet/pods/8779f5f7-abfa-4e15-82cd-ede83b6785e9/volumes" Apr 16 18:13:29.352349 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:29.352262 2571 generic.go:358] "Generic (PLEG): container finished" podID="60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3" containerID="c8323de528ee08d5dd405b2e928d1741d17cb4e68c2da6fa1322e86a20e463e7" exitCode=0 Apr 16 18:13:29.352349 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:29.352335 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" event={"ID":"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3","Type":"ContainerDied","Data":"c8323de528ee08d5dd405b2e928d1741d17cb4e68c2da6fa1322e86a20e463e7"} Apr 16 18:13:29.352760 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:29.352729 2571 scope.go:117] "RemoveContainer" containerID="c8323de528ee08d5dd405b2e928d1741d17cb4e68c2da6fa1322e86a20e463e7" Apr 16 18:13:30.362192 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:30.362157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-98zlj" event={"ID":"60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3","Type":"ContainerStarted","Data":"fb3d71cf20a2ab6a1abdfb0de60ed6ef1ba84e537ed6019408539db6c7283b9e"} Apr 16 18:13:47.385432 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:47.385328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:13:47.387738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:47.387713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d27531f-08c4-4c67-974c-31cacc77b8be-metrics-certs\") pod \"network-metrics-daemon-lnrzm\" (UID: \"9d27531f-08c4-4c67-974c-31cacc77b8be\") " pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:13:47.542198 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:47.542172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kjmpp\"" Apr 16 18:13:47.550148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:47.550128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnrzm" Apr 16 18:13:47.674000 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:47.673917 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lnrzm"] Apr 16 18:13:47.677485 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:13:47.677459 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d27531f_08c4_4c67_974c_31cacc77b8be.slice/crio-a7e4ccb049057116bf2b383a0de14b0579958f2d0c564941bed1fe269b59c350 WatchSource:0}: Error finding container a7e4ccb049057116bf2b383a0de14b0579958f2d0c564941bed1fe269b59c350: Status 404 returned error can't find the container with id a7e4ccb049057116bf2b383a0de14b0579958f2d0c564941bed1fe269b59c350 Apr 16 18:13:48.417519 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:48.417484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnrzm" event={"ID":"9d27531f-08c4-4c67-974c-31cacc77b8be","Type":"ContainerStarted","Data":"a7e4ccb049057116bf2b383a0de14b0579958f2d0c564941bed1fe269b59c350"} Apr 16 18:13:49.421725 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:49.421678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnrzm" event={"ID":"9d27531f-08c4-4c67-974c-31cacc77b8be","Type":"ContainerStarted","Data":"4d21f9ef8a4499f13494fdcdfe48036ba20f3213671ee672ba5b225e0b2c34b8"} Apr 16 18:13:49.421725 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:49.421727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnrzm" event={"ID":"9d27531f-08c4-4c67-974c-31cacc77b8be","Type":"ContainerStarted","Data":"c5700573bae1741f0cd11b0f3c19b748f32f272b173a7e160917ef0136d6a9c6"} Apr 16 18:13:49.445161 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:13:49.445109 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lnrzm" podStartSLOduration=253.560680954 podStartE2EDuration="4m14.445093195s" podCreationTimestamp="2026-04-16 18:09:35 +0000 UTC" firstStartedPulling="2026-04-16 18:13:47.679337741 +0000 UTC m=+252.603888371" lastFinishedPulling="2026-04-16 18:13:48.563749983 +0000 UTC m=+253.488300612" observedRunningTime="2026-04-16 18:13:49.444607191 +0000 UTC m=+254.369157857" watchObservedRunningTime="2026-04-16 18:13:49.445093195 +0000 UTC m=+254.369643846" Apr 16 18:14:10.787640 ip-10-0-142-228 
kubenswrapper[2571]: I0416 18:14:10.787603 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788039 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08bf3f64-b45f-4345-93cd-34468773149c" containerName="registry" Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788058 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bf3f64-b45f-4345-93cd-34468773149c" containerName="registry" Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788087 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" containerName="console" Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788096 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" containerName="console" Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788178 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="08bf3f64-b45f-4345-93cd-34468773149c" containerName="registry" Apr 16 18:14:10.788261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.788195 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8779f5f7-abfa-4e15-82cd-ede83b6785e9" containerName="console" Apr 16 18:14:10.790946 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.790921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.800385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.800359 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:10.879608 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.879824 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.879824 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.879824 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 
18:14:10.879824 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.880013 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tbp\" (UniqueName: \"kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.880013 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.879872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981142 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981142 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tbp\" (UniqueName: \"kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981395 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981395 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981395 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981395 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981395 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981333 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.981930 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.981901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.982069 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.982047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.982125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.982050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.982216 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.982198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.983742 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.983717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.983909 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.983893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:10.989241 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:10.989223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tbp\" (UniqueName: \"kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp\") pod \"console-94b7cc454-2kx9l\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:11.101460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:11.101365 2571 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:11.224726 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:11.224620 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:11.227351 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:14:11.227319 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1313f49_5b4a_4a78_aaf2_578aa2c08d49.slice/crio-30710c0fafa52308288d08e68d776aca484baee70ac037335646f792ea1a960a WatchSource:0}: Error finding container 30710c0fafa52308288d08e68d776aca484baee70ac037335646f792ea1a960a: Status 404 returned error can't find the container with id 30710c0fafa52308288d08e68d776aca484baee70ac037335646f792ea1a960a Apr 16 18:14:11.488052 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:11.488014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94b7cc454-2kx9l" event={"ID":"c1313f49-5b4a-4a78-aaf2-578aa2c08d49","Type":"ContainerStarted","Data":"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544"} Apr 16 18:14:11.488189 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:11.488058 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94b7cc454-2kx9l" event={"ID":"c1313f49-5b4a-4a78-aaf2-578aa2c08d49","Type":"ContainerStarted","Data":"30710c0fafa52308288d08e68d776aca484baee70ac037335646f792ea1a960a"} Apr 16 18:14:11.507766 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:11.507718 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-94b7cc454-2kx9l" podStartSLOduration=1.507699914 podStartE2EDuration="1.507699914s" podCreationTimestamp="2026-04-16 18:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:11.506153786 +0000 UTC m=+276.430704436" watchObservedRunningTime="2026-04-16 18:14:11.507699914 +0000 UTC m=+276.432250563" Apr 16 18:14:19.945918 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:19.945885 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:19.977987 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:19.977951 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:14:19.980127 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:19.980106 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:19.992960 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:19.992933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:14:20.161881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.161846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.161881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.161884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.162113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.161900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq85j\" (UniqueName: \"kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.162113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.162019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.162113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.162068 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.162113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.162104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.162263 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.162119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca\") pod 
\"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq85j\" (UniqueName: \"kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.263926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.264039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.264039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.263946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.264128 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.264114 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.265877 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.265848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.265957 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.265856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.273641 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.273621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq85j\" (UniqueName: \"kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j\") pod \"console-87977c59d-f5s79\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") " pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.288817 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.288794 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:20.428804 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.428767 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:14:20.432294 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:14:20.432267 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b107aa_f6da_49ce_abb2_9db8f9af18ab.slice/crio-baaa685ba2e756fd3dfb9af21854adb5fe36eda07ff66b0acecb1580c7a0ca91 WatchSource:0}: Error finding container baaa685ba2e756fd3dfb9af21854adb5fe36eda07ff66b0acecb1580c7a0ca91: Status 404 returned error can't find the container with id baaa685ba2e756fd3dfb9af21854adb5fe36eda07ff66b0acecb1580c7a0ca91 Apr 16 18:14:20.518539 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.518432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87977c59d-f5s79" event={"ID":"e1b107aa-f6da-49ce-abb2-9db8f9af18ab","Type":"ContainerStarted","Data":"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d"} Apr 16 18:14:20.518539 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.518480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87977c59d-f5s79" event={"ID":"e1b107aa-f6da-49ce-abb2-9db8f9af18ab","Type":"ContainerStarted","Data":"baaa685ba2e756fd3dfb9af21854adb5fe36eda07ff66b0acecb1580c7a0ca91"} Apr 16 18:14:20.537557 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:20.537502 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-87977c59d-f5s79" podStartSLOduration=1.5374880850000001 podStartE2EDuration="1.537488085s" podCreationTimestamp="2026-04-16 18:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:20.536016369 +0000 UTC m=+285.460567034" watchObservedRunningTime="2026-04-16 18:14:20.537488085 +0000 UTC m=+285.462038737" Apr 16 18:14:21.102347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:21.102318 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:30.289674 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:30.289609 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:30.289674 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:30.289675 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:30.294343 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:30.294317 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:30.548932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:30.548843 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:14:30.600239 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:30.600209 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"] Apr 16 18:14:35.480859 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:35.480821 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:14:35.481279 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:35.481202 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:14:35.493105 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:35.493080 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:14:44.964662 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:44.964602 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-94b7cc454-2kx9l" podUID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" containerName="console" containerID="cri-o://2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544" gracePeriod=15 Apr 16 18:14:45.190066 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.190045 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94b7cc454-2kx9l_c1313f49-5b4a-4a78-aaf2-578aa2c08d49/console/0.log" Apr 16 18:14:45.190185 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.190105 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:45.266215 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266136 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266220 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266246 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266270 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266287 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8tbp\" (UniqueName: \"kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266466 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266565 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266512 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert\") pod \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\" (UID: \"c1313f49-5b4a-4a78-aaf2-578aa2c08d49\") " Apr 16 18:14:45.266621 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266594 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca" (OuterVolumeSpecName: "service-ca") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:45.266807 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266785 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-service-ca\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.266807 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266785 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:45.266934 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266793 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config" (OuterVolumeSpecName: "console-config") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:45.266934 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.266847 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:45.268458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.268432 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:45.268541 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.268472 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:45.268541 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.268510 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp" (OuterVolumeSpecName: "kube-api-access-f8tbp") pod "c1313f49-5b4a-4a78-aaf2-578aa2c08d49" (UID: "c1313f49-5b4a-4a78-aaf2-578aa2c08d49"). InnerVolumeSpecName "kube-api-access-f8tbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:45.367735 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367681 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-trusted-ca-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.367735 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367734 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.367980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367746 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-oauth-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.367980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367762 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8tbp\" (UniqueName: \"kubernetes.io/projected/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-kube-api-access-f8tbp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.367980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367774 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-oauth-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.367980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.367787 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1313f49-5b4a-4a78-aaf2-578aa2c08d49-console-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:45.588168 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588080 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94b7cc454-2kx9l_c1313f49-5b4a-4a78-aaf2-578aa2c08d49/console/0.log" Apr 16 18:14:45.588168 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588121 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" containerID="2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544" exitCode=2 Apr 16 18:14:45.588362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588197 2571 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94b7cc454-2kx9l" Apr 16 18:14:45.588362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94b7cc454-2kx9l" event={"ID":"c1313f49-5b4a-4a78-aaf2-578aa2c08d49","Type":"ContainerDied","Data":"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544"} Apr 16 18:14:45.588362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588267 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94b7cc454-2kx9l" event={"ID":"c1313f49-5b4a-4a78-aaf2-578aa2c08d49","Type":"ContainerDied","Data":"30710c0fafa52308288d08e68d776aca484baee70ac037335646f792ea1a960a"} Apr 16 18:14:45.588362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.588287 2571 scope.go:117] "RemoveContainer" containerID="2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544" Apr 16 18:14:45.597081 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.597064 2571 scope.go:117] "RemoveContainer" containerID="2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544" Apr 16 18:14:45.597328 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:14:45.597311 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544\": container with ID starting with 2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544 not found: ID does not exist" containerID="2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544" Apr 16 18:14:45.597371 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.597337 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544"} err="failed to get container status \"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544\": rpc error: code = NotFound desc = could not find container \"2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544\": container with ID starting with 2777163a6f15e6e4f1752435f905a6bc3763cc84d419652f25098374091bd544 not found: ID does not exist" Apr 16 18:14:45.610944 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.610918 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:45.616287 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.616264 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-94b7cc454-2kx9l"] Apr 16 18:14:45.641242 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:45.641212 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" path="/var/lib/kubelet/pods/c1313f49-5b4a-4a78-aaf2-578aa2c08d49/volumes" Apr 16 18:14:55.621001 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.620957 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86dcb9b989-jpc8v" podUID="5892585b-0a69-4e74-ac1c-a39d29ce132e" containerName="console" containerID="cri-o://77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd" gracePeriod=15 Apr 16 18:14:55.856746 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.856724 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86dcb9b989-jpc8v_5892585b-0a69-4e74-ac1c-a39d29ce132e/console/0.log" Apr 16 18:14:55.856877 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.856787 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86dcb9b989-jpc8v" Apr 16 18:14:55.956574 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956540 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.956768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956595 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.956768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956622 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kqmv\" (UniqueName: \"kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.956768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.956768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956671 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.956768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956733 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.957025 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.956782 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config\") pod \"5892585b-0a69-4e74-ac1c-a39d29ce132e\" (UID: \"5892585b-0a69-4e74-ac1c-a39d29ce132e\") " Apr 16 18:14:55.957084 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.957035 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config" (OuterVolumeSpecName: "console-config") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:55.957084 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.957061 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:55.957233 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.957201 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca" (OuterVolumeSpecName: "service-ca") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:55.957391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.957374 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:55.958921 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.958886 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:55.958921 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.958895 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv" (OuterVolumeSpecName: "kube-api-access-9kqmv") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "kube-api-access-9kqmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:55.959037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:55.958926 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5892585b-0a69-4e74-ac1c-a39d29ce132e" (UID: "5892585b-0a69-4e74-ac1c-a39d29ce132e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:56.057596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057542 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-oauth-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057587 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057598 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kqmv\" (UniqueName: \"kubernetes.io/projected/5892585b-0a69-4e74-ac1c-a39d29ce132e-kube-api-access-9kqmv\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057608 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-trusted-ca-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057617 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5892585b-0a69-4e74-ac1c-a39d29ce132e-service-ca\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057634 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.057926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.057642 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5892585b-0a69-4e74-ac1c-a39d29ce132e-console-oauth-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:14:56.622067 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622037 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86dcb9b989-jpc8v_5892585b-0a69-4e74-ac1c-a39d29ce132e/console/0.log" Apr 16 18:14:56.622497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622080 2571 generic.go:358] "Generic (PLEG): container finished" podID="5892585b-0a69-4e74-ac1c-a39d29ce132e" containerID="77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd" exitCode=2 Apr 16 18:14:56.622497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622154 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86dcb9b989-jpc8v" Apr 16 18:14:56.622497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86dcb9b989-jpc8v" event={"ID":"5892585b-0a69-4e74-ac1c-a39d29ce132e","Type":"ContainerDied","Data":"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd"} Apr 16 18:14:56.622497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86dcb9b989-jpc8v" event={"ID":"5892585b-0a69-4e74-ac1c-a39d29ce132e","Type":"ContainerDied","Data":"d4513fecc78c5eefa159721e6c474f1a75a93f4a16b5e84bb1f7b9fd84447b2b"} Apr 16 18:14:56.622497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.622227 2571 scope.go:117] "RemoveContainer" containerID="77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd" Apr 16 18:14:56.630229 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.630209 2571 scope.go:117] "RemoveContainer" containerID="77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd" Apr 16 18:14:56.630471 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:14:56.630452 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd\": container with ID starting with 77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd not found: ID does not exist" containerID="77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd" Apr 16 18:14:56.630527 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.630479 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd"} err="failed to get container status \"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd\": rpc error: code = NotFound desc = could not find container \"77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd\": container with ID starting with 77bdcb8d8c755f8ace6122815bb49db940a230de634dd8654cf4a34486f338bd not found: ID does not exist" Apr 16 18:14:56.646301 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.646276 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"] Apr 16 18:14:56.649752 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:56.649729 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86dcb9b989-jpc8v"] Apr 16 18:14:57.640157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:14:57.640126 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5892585b-0a69-4e74-ac1c-a39d29ce132e" path="/var/lib/kubelet/pods/5892585b-0a69-4e74-ac1c-a39d29ce132e/volumes" Apr 16 18:17:22.984869 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.984824 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr"] Apr 16 18:17:22.985504 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985485 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5892585b-0a69-4e74-ac1c-a39d29ce132e" containerName="console" Apr 16 18:17:22.985621 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985508 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5892585b-0a69-4e74-ac1c-a39d29ce132e" containerName="console" Apr 16 
18:17:22.985621 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985549 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" containerName="console" Apr 16 18:17:22.985621 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985560 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" containerName="console" Apr 16 18:17:22.985825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985718 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5892585b-0a69-4e74-ac1c-a39d29ce132e" containerName="console" Apr 16 18:17:22.985825 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.985749 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1313f49-5b4a-4a78-aaf2-578aa2c08d49" containerName="console" Apr 16 18:17:22.989366 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.989334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:22.991778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.991754 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:17:22.992473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.992456 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\"" Apr 16 18:17:22.992577 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:22.992463 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:17:23.004856 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.004830 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr"] Apr 16 18:17:23.099881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.099845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9fj\" (UniqueName: \"kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.099881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.099889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.100105 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.099971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.200983 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:17:23.200935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.201186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.201035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9fj\" (UniqueName: \"kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.201186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.201069 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.201362 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.201339 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.201429 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.201398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.210510 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.210472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9fj\" (UniqueName: \"kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.298493 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.298397 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:23.419653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.419620 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr"] Apr 16 18:17:23.422669 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:17:23.422641 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10166f3_05c4_4e9e_9472_962b590b31a2.slice/crio-69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e WatchSource:0}: Error finding container 69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e: Status 404 returned error can't find the container with id 69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e Apr 16 18:17:23.424398 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:23.424382 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:17:24.048282 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:24.048247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" event={"ID":"f10166f3-05c4-4e9e-9472-962b590b31a2","Type":"ContainerStarted","Data":"69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e"} Apr 16 18:17:29.065517 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:29.065477 2571 generic.go:358] "Generic (PLEG): container finished" podID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerID="c48b90579dede4189f59f8ae55516d0f9eee2a74270831e189d6a0a0f80778c3" exitCode=0 Apr 16 18:17:29.066023 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:29.065551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" event={"ID":"f10166f3-05c4-4e9e-9472-962b590b31a2","Type":"ContainerDied","Data":"c48b90579dede4189f59f8ae55516d0f9eee2a74270831e189d6a0a0f80778c3"} Apr 16 18:17:32.076712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:32.076648 2571 generic.go:358] "Generic (PLEG): container finished" podID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerID="1cd8c369d551cd32696d6faab233ac0c4d224abaf224dbd29472b4f2bb732fa9" exitCode=0 Apr 16 18:17:32.077085 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:32.076735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" event={"ID":"f10166f3-05c4-4e9e-9472-962b590b31a2","Type":"ContainerDied","Data":"1cd8c369d551cd32696d6faab233ac0c4d224abaf224dbd29472b4f2bb732fa9"} Apr 16 18:17:38.098320 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:38.098232 2571 generic.go:358] "Generic (PLEG): container finished" podID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerID="7c35b926aad84edb68e102f15cc6cbc595e09ef3bb8d1502beda2b2def3c7c1c" exitCode=0 Apr 16 18:17:38.098755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:38.098314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" event={"ID":"f10166f3-05c4-4e9e-9472-962b590b31a2","Type":"ContainerDied","Data":"7c35b926aad84edb68e102f15cc6cbc595e09ef3bb8d1502beda2b2def3c7c1c"} Apr 16 18:17:39.223097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.223074 2571 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:39.342989 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.342947 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util\") pod \"f10166f3-05c4-4e9e-9472-962b590b31a2\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " Apr 16 18:17:39.343166 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.343023 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf9fj\" (UniqueName: \"kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj\") pod \"f10166f3-05c4-4e9e-9472-962b590b31a2\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " Apr 16 18:17:39.343166 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.343055 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle\") pod \"f10166f3-05c4-4e9e-9472-962b590b31a2\" (UID: \"f10166f3-05c4-4e9e-9472-962b590b31a2\") " Apr 16 18:17:39.343625 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.343598 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle" (OuterVolumeSpecName: "bundle") pod "f10166f3-05c4-4e9e-9472-962b590b31a2" (UID: "f10166f3-05c4-4e9e-9472-962b590b31a2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:17:39.345222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.345193 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj" (OuterVolumeSpecName: "kube-api-access-lf9fj") pod "f10166f3-05c4-4e9e-9472-962b590b31a2" (UID: "f10166f3-05c4-4e9e-9472-962b590b31a2"). InnerVolumeSpecName "kube-api-access-lf9fj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:17:39.347192 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.347166 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util" (OuterVolumeSpecName: "util") pod "f10166f3-05c4-4e9e-9472-962b590b31a2" (UID: "f10166f3-05c4-4e9e-9472-962b590b31a2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:17:39.444019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.443939 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lf9fj\" (UniqueName: \"kubernetes.io/projected/f10166f3-05c4-4e9e-9472-962b590b31a2-kube-api-access-lf9fj\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:17:39.444019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.443968 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:17:39.444019 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:39.443978 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f10166f3-05c4-4e9e-9472-962b590b31a2-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:17:40.105976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:40.105943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" event={"ID":"f10166f3-05c4-4e9e-9472-962b590b31a2","Type":"ContainerDied","Data":"69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e"} Apr 16 18:17:40.105976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:40.105976 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69fa8162c783d2a8cc2b7b2d17893bf8e464431198ec7a08267653139295293e" Apr 16 18:17:40.106197 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:40.105976 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4s2mr" Apr 16 18:17:48.889119 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889079 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbkvf"] Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889567 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="util" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889594 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="util" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889610 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="pull" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889619 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="pull" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889635 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="extract" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889644 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="extract" Apr 16 18:17:48.889767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.889760 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f10166f3-05c4-4e9e-9472-962b590b31a2" containerName="extract" Apr 16 18:17:48.902244 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.902216 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:48.903009 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.902974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbkvf"] Apr 16 18:17:48.908177 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.908156 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:17:48.908296 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.908198 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:17:48.909847 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.909814 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:17:48.909964 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.909886 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:17:48.910240 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.910223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-9pw79\"" Apr 16 18:17:48.910645 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:48.910630 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:17:49.019901 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.019863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/65b59782-5233-4c2d-a0fb-df84a646514b-cabundle0\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.020079 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.019916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsg4k\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-kube-api-access-bsg4k\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.020079 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.019943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.121245 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.121204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/65b59782-5233-4c2d-a0fb-df84a646514b-cabundle0\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.121403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.121271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsg4k\" (UniqueName: 
\"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-kube-api-access-bsg4k\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.121403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.121299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.121472 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.121420 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:17:49.121472 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.121432 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:17:49.121472 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.121441 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbkvf: references non-existent secret key: ca.crt Apr 16 18:17:49.121584 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.121496 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates podName:65b59782-5233-4c2d-a0fb-df84a646514b nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.621481187 +0000 UTC m=+494.546031815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates") pod "keda-operator-ffbb595cb-zbkvf" (UID: "65b59782-5233-4c2d-a0fb-df84a646514b") : references non-existent secret key: ca.crt Apr 16 18:17:49.121867 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.121847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/65b59782-5233-4c2d-a0fb-df84a646514b-cabundle0\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.130422 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.130399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsg4k\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-kube-api-access-bsg4k\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.200706 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.200658 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x"] Apr 16 18:17:49.214785 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.214759 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x"] Apr 16 18:17:49.214942 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.214887 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.217586 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.217559 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:17:49.322758 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.322713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4m5\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-kube-api-access-mg4m5\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.322758 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.322756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.322997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.322837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fc033f7c-d28c-4f77-958a-507fec58c55f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.424220 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.424173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4m5\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-kube-api-access-mg4m5\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.424220 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.424227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.424432 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.424314 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:17:49.424432 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.424328 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:17:49.424432 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.424347 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x: references non-existent secret key: tls.crt Apr 16 18:17:49.424432 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.424350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fc033f7c-d28c-4f77-958a-507fec58c55f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.424432 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.424390 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates podName:fc033f7c-d28c-4f77-958a-507fec58c55f nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.924375513 +0000 UTC m=+494.848926142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates") pod "keda-metrics-apiserver-7c9f485588-jtz7x" (UID: "fc033f7c-d28c-4f77-958a-507fec58c55f") : references non-existent secret key: tls.crt Apr 16 18:17:49.424671 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.424655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fc033f7c-d28c-4f77-958a-507fec58c55f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.433205 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.433182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4m5\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-kube-api-access-mg4m5\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.496642 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.496557 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-ntzw8"] Apr 16 18:17:49.519740 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.519711 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ntzw8"] Apr 16 18:17:49.519882 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.519810 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.522146 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.522125 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:17:49.625991 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.625953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.626175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.626000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5962f\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-kube-api-access-5962f\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.626175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.626051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:49.626175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.626149 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:17:49.626175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.626160 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:17:49.626175 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.626169 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbkvf: references non-existent secret key: ca.crt Apr 16 18:17:49.626359 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.626218 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates podName:65b59782-5233-4c2d-a0fb-df84a646514b nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.626204198 +0000 UTC m=+495.550754826 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates") pod "keda-operator-ffbb595cb-zbkvf" (UID: "65b59782-5233-4c2d-a0fb-df84a646514b") : references non-existent secret key: ca.crt Apr 16 18:17:49.726943 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.726906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.727126 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.726952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5962f\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-kube-api-access-5962f\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.727126 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.727043 2571 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 18:17:49.727126 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.727069 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-ntzw8: secret "keda-admission-webhooks-certs" not found Apr 16 18:17:49.727238 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.727129 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates podName:36e47f93-6cab-4b0c-a25a-cb29249e53b7 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.227113121 +0000 UTC m=+495.151663750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates") pod "keda-admission-cf49989db-ntzw8" (UID: "36e47f93-6cab-4b0c-a25a-cb29249e53b7") : secret "keda-admission-webhooks-certs" not found Apr 16 18:17:49.736567 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.736542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5962f\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-kube-api-access-5962f\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:49.928429 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:49.928334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:49.928841 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.928470 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:17:49.928841 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.928490 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:17:49.928841 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.928507 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x: references non-existent secret key: tls.crt Apr 16 18:17:49.928841 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:49.928559 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates podName:fc033f7c-d28c-4f77-958a-507fec58c55f nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.928545319 +0000 UTC m=+495.853095947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates") pod "keda-metrics-apiserver-7c9f485588-jtz7x" (UID: "fc033f7c-d28c-4f77-958a-507fec58c55f") : references non-existent secret key: tls.crt Apr 16 18:17:50.230112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.230082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:50.232593 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.232568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36e47f93-6cab-4b0c-a25a-cb29249e53b7-certificates\") pod \"keda-admission-cf49989db-ntzw8\" (UID: \"36e47f93-6cab-4b0c-a25a-cb29249e53b7\") " pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:50.429541 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.429496 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:50.553673 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.553566 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ntzw8"] Apr 16 18:17:50.555906 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:17:50.555878 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e47f93_6cab_4b0c_a25a_cb29249e53b7.slice/crio-34aa1e722c857be419bf86bf9cd347fc6855243cc87c9b9e9d196d4447d84382 WatchSource:0}: Error finding container 34aa1e722c857be419bf86bf9cd347fc6855243cc87c9b9e9d196d4447d84382: Status 404 returned error can't find the container with id 34aa1e722c857be419bf86bf9cd347fc6855243cc87c9b9e9d196d4447d84382 Apr 16 18:17:50.632925 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.632887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:50.633124 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.633046 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:17:50.633124 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.633068 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:17:50.633124 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.633080 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbkvf: references non-existent secret key: ca.crt Apr 16 18:17:50.633304 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.633154 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates podName:65b59782-5233-4c2d-a0fb-df84a646514b nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.633133539 +0000 UTC m=+497.557684168 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates") pod "keda-operator-ffbb595cb-zbkvf" (UID: "65b59782-5233-4c2d-a0fb-df84a646514b") : references non-existent secret key: ca.crt Apr 16 18:17:50.935181 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:50.935091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:50.935559 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.935202 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:17:50.935559 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.935213 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:17:50.935559 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.935230 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x: references non-existent secret key: tls.crt Apr 16 18:17:50.935559 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:50.935275 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates podName:fc033f7c-d28c-4f77-958a-507fec58c55f nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.935263054 +0000 UTC m=+497.859813682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates") pod "keda-metrics-apiserver-7c9f485588-jtz7x" (UID: "fc033f7c-d28c-4f77-958a-507fec58c55f") : references non-existent secret key: tls.crt Apr 16 18:17:51.136781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:51.136739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ntzw8" event={"ID":"36e47f93-6cab-4b0c-a25a-cb29249e53b7","Type":"ContainerStarted","Data":"34aa1e722c857be419bf86bf9cd347fc6855243cc87c9b9e9d196d4447d84382"} Apr 16 18:17:52.648877 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:52.648838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:52.649383 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.649120 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:17:52.649383 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.649140 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:17:52.649383 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.649149 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zbkvf: references non-existent secret key: ca.crt Apr 16 18:17:52.649383 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.649198 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates podName:65b59782-5233-4c2d-a0fb-df84a646514b nodeName:}" failed. No retries permitted until 2026-04-16 18:17:56.649181745 +0000 UTC m=+501.573732392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates") pod "keda-operator-ffbb595cb-zbkvf" (UID: "65b59782-5233-4c2d-a0fb-df84a646514b") : references non-existent secret key: ca.crt Apr 16 18:17:52.951465 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:52.951432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:52.951651 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.951598 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:17:52.951651 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.951623 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:17:52.951651 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.951645 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x: references non-existent secret key: tls.crt Apr 16 18:17:52.951830 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:17:52.951739 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates podName:fc033f7c-d28c-4f77-958a-507fec58c55f nodeName:}" failed. No retries permitted until 2026-04-16 18:17:56.951718676 +0000 UTC m=+501.876269310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates") pod "keda-metrics-apiserver-7c9f485588-jtz7x" (UID: "fc033f7c-d28c-4f77-958a-507fec58c55f") : references non-existent secret key: tls.crt Apr 16 18:17:53.145970 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:53.145922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ntzw8" event={"ID":"36e47f93-6cab-4b0c-a25a-cb29249e53b7","Type":"ContainerStarted","Data":"3827a2571c571aae1304a5410b78ceb1f5b8442b0dfbbbfbfd72fe43d38b7a25"} Apr 16 18:17:53.146324 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:53.146305 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:17:53.164453 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:53.164402 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-ntzw8" podStartSLOduration=2.562684296 podStartE2EDuration="4.164387373s" podCreationTimestamp="2026-04-16 18:17:49 +0000 UTC" firstStartedPulling="2026-04-16 18:17:50.557121531 +0000 UTC m=+495.481672161" lastFinishedPulling="2026-04-16 18:17:52.15882461 +0000 UTC m=+497.083375238" observedRunningTime="2026-04-16 18:17:53.162365061 +0000 UTC m=+498.086915712" watchObservedRunningTime="2026-04-16 18:17:53.164387373 +0000 UTC m=+498.088938024" Apr 16 18:17:56.684211 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.684171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:56.686809 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.686779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b59782-5233-4c2d-a0fb-df84a646514b-certificates\") pod \"keda-operator-ffbb595cb-zbkvf\" (UID: \"65b59782-5233-4c2d-a0fb-df84a646514b\") " pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:56.713564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.713523 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:17:56.845522 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.845497 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zbkvf"] Apr 16 18:17:56.847841 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:17:56.847806 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b59782_5233_4c2d_a0fb_df84a646514b.slice/crio-30939d448b7752b80308dc0e4cb79a59a736b8c81a8e50e0b3fceb42219dbb54 WatchSource:0}: Error finding container 30939d448b7752b80308dc0e4cb79a59a736b8c81a8e50e0b3fceb42219dbb54: Status 404 returned error can't find the container with id 30939d448b7752b80308dc0e4cb79a59a736b8c81a8e50e0b3fceb42219dbb54 Apr 16 18:17:56.986215 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.986164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:56.988736 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:56.988710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc033f7c-d28c-4f77-958a-507fec58c55f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtz7x\" (UID: \"fc033f7c-d28c-4f77-958a-507fec58c55f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:57.026516 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:57.026472 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:17:57.148591 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:57.148558 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x"] Apr 16 18:17:57.151637 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:17:57.151611 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc033f7c_d28c_4f77_958a_507fec58c55f.slice/crio-c849a76ae951c06ad9548e4f27843fcdc91c21d9730517c49fdcf6258aa728c6 WatchSource:0}: Error finding container c849a76ae951c06ad9548e4f27843fcdc91c21d9730517c49fdcf6258aa728c6: Status 404 returned error can't find the container with id c849a76ae951c06ad9548e4f27843fcdc91c21d9730517c49fdcf6258aa728c6 Apr 16 18:17:57.157739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:57.157712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" event={"ID":"fc033f7c-d28c-4f77-958a-507fec58c55f","Type":"ContainerStarted","Data":"c849a76ae951c06ad9548e4f27843fcdc91c21d9730517c49fdcf6258aa728c6"} Apr 16 18:17:57.158787 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:17:57.158767 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" event={"ID":"65b59782-5233-4c2d-a0fb-df84a646514b","Type":"ContainerStarted","Data":"30939d448b7752b80308dc0e4cb79a59a736b8c81a8e50e0b3fceb42219dbb54"} Apr 16 18:18:00.168240 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:00.168199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" event={"ID":"65b59782-5233-4c2d-a0fb-df84a646514b","Type":"ContainerStarted","Data":"1dadcde331ed42a46a40ff57160e2813057706cc617e4f31bf1cdb082ada759a"} Apr 16 18:18:00.168617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:00.168383 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:18:00.185721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:00.185654 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" podStartSLOduration=9.506329668 podStartE2EDuration="12.185640246s" podCreationTimestamp="2026-04-16 18:17:48 +0000 UTC" firstStartedPulling="2026-04-16 18:17:56.84917029 +0000 UTC m=+501.773720922" lastFinishedPulling="2026-04-16 18:17:59.528480857 +0000 UTC m=+504.453031500" observedRunningTime="2026-04-16 18:18:00.18380487 +0000 UTC m=+505.108355520" watchObservedRunningTime="2026-04-16 18:18:00.185640246 +0000 UTC m=+505.110190897" Apr 16 18:18:04.182675 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:04.182636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" event={"ID":"fc033f7c-d28c-4f77-958a-507fec58c55f","Type":"ContainerStarted","Data":"570cbf21a6a4009d982ea0626969f9e494f4d22e05ae5d39acacf6d348a93401"} Apr 16 18:18:04.183181 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:04.182747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:18:04.199521 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:04.199458 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" podStartSLOduration=8.781798239 
podStartE2EDuration="15.199437709s" podCreationTimestamp="2026-04-16 18:17:49 +0000 UTC" firstStartedPulling="2026-04-16 18:17:57.153113856 +0000 UTC m=+502.077664490" lastFinishedPulling="2026-04-16 18:18:03.570753332 +0000 UTC m=+508.495303960" observedRunningTime="2026-04-16 18:18:04.198524798 +0000 UTC m=+509.123075455" watchObservedRunningTime="2026-04-16 18:18:04.199437709 +0000 UTC m=+509.123988362" Apr 16 18:18:14.151778 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:14.151747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-ntzw8" Apr 16 18:18:15.190759 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:15.190731 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtz7x" Apr 16 18:18:21.174074 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:21.173998 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-zbkvf" Apr 16 18:18:43.499327 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.499293 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t"] Apr 16 18:18:43.505318 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.505297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.507841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.507819 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.508571 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.508555 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.508659 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.508583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\"" Apr 16 18:18:43.513035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.513011 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t"] Apr 16 18:18:43.577672 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.577637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.577869 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.577736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.577869 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.577764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-k6kfz\" (UniqueName: \"kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.679049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.679016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.679232 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.679059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kfz\" (UniqueName: \"kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.679232 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.679109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.679475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.679454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.679514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.679464 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.689199 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.689170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6kfz\" (UniqueName: \"kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.814656 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.814566 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:43.957772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:43.957740 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t"] Apr 16 18:18:43.960898 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:18:43.960870 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211fa812_b011_4232_a142_859827e6466e.slice/crio-4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64 WatchSource:0}: Error finding container 4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64: Status 404 returned error can't find the container with id 4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64 Apr 16 18:18:44.300860 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:44.300827 2571 generic.go:358] "Generic (PLEG): container finished" podID="211fa812-b011-4232-a142-859827e6466e" containerID="43a53fae856b247ef196ec88df6a5d1302b7d2a21b8368816b515fbe75ad34cd" exitCode=0 Apr 16 18:18:44.301037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:44.300868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" event={"ID":"211fa812-b011-4232-a142-859827e6466e","Type":"ContainerDied","Data":"43a53fae856b247ef196ec88df6a5d1302b7d2a21b8368816b515fbe75ad34cd"} Apr 16 18:18:44.301037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:44.300904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" event={"ID":"211fa812-b011-4232-a142-859827e6466e","Type":"ContainerStarted","Data":"4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64"} Apr 16 18:18:45.305735 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:45.305631 2571 generic.go:358] "Generic (PLEG): container finished" podID="211fa812-b011-4232-a142-859827e6466e" containerID="562c57cbac66f625aed1f470efbdb1c0c4d62bebfeb9990b98c3120e54a4f4ab" exitCode=0 Apr 16 18:18:45.306153 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:45.305726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" event={"ID":"211fa812-b011-4232-a142-859827e6466e","Type":"ContainerDied","Data":"562c57cbac66f625aed1f470efbdb1c0c4d62bebfeb9990b98c3120e54a4f4ab"} Apr 16 18:18:46.310209 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:46.310172 2571 generic.go:358] "Generic (PLEG): container finished" podID="211fa812-b011-4232-a142-859827e6466e" containerID="695260495e3ed24f1856fcdf2e5c78c74eda2b74e3f55252013e8c409dda4ba0" exitCode=0 Apr 16 18:18:46.310586 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:46.310287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" event={"ID":"211fa812-b011-4232-a142-859827e6466e","Type":"ContainerDied","Data":"695260495e3ed24f1856fcdf2e5c78c74eda2b74e3f55252013e8c409dda4ba0"} Apr 16 18:18:47.437899 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.437876 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:47.509080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.509044 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6kfz\" (UniqueName: \"kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz\") pod \"211fa812-b011-4232-a142-859827e6466e\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " Apr 16 18:18:47.509239 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.509093 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util\") pod \"211fa812-b011-4232-a142-859827e6466e\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " Apr 16 18:18:47.509239 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.509211 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle\") pod \"211fa812-b011-4232-a142-859827e6466e\" (UID: \"211fa812-b011-4232-a142-859827e6466e\") " Apr 16 18:18:47.509815 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.509794 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle" (OuterVolumeSpecName: "bundle") pod "211fa812-b011-4232-a142-859827e6466e" (UID: "211fa812-b011-4232-a142-859827e6466e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:18:47.511225 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.511199 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz" (OuterVolumeSpecName: "kube-api-access-k6kfz") pod "211fa812-b011-4232-a142-859827e6466e" (UID: "211fa812-b011-4232-a142-859827e6466e"). InnerVolumeSpecName "kube-api-access-k6kfz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:18:47.514496 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.514476 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util" (OuterVolumeSpecName: "util") pod "211fa812-b011-4232-a142-859827e6466e" (UID: "211fa812-b011-4232-a142-859827e6466e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:18:47.610309 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.610232 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:18:47.610309 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.610256 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k6kfz\" (UniqueName: \"kubernetes.io/projected/211fa812-b011-4232-a142-859827e6466e-kube-api-access-k6kfz\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:18:47.610309 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:47.610267 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/211fa812-b011-4232-a142-859827e6466e-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:18:48.318739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:48.318703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" event={"ID":"211fa812-b011-4232-a142-859827e6466e","Type":"ContainerDied","Data":"4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64"} Apr 16 18:18:48.318739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:48.318740 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1133708883dad248cde1c767c23e80684cd73ed529c19bd839c0f58b32ea64" Apr 16 18:18:48.318739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:48.318710 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fp74t" Apr 16 18:18:50.696514 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696482 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx"] Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696813 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="util" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696823 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="util" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696833 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="pull" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696838 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="pull" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696846 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="extract" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696851 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="211fa812-b011-4232-a142-859827e6466e" containerName="extract" Apr 16 18:18:50.696928 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.696914 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="211fa812-b011-4232-a142-859827e6466e" 
containerName="extract" Apr 16 18:18:50.699812 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.699794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:50.701992 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.701962 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 18:18:50.702129 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.702047 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:50.702129 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.702046 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-cf4p8\"" Apr 16 18:18:50.712466 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.712442 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx"] Apr 16 18:18:50.738001 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.737973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/077f8953-b5fa-43df-b690-0477cba13091-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:50.738141 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.738034 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7z7\" (UniqueName: \"kubernetes.io/projected/077f8953-b5fa-43df-b690-0477cba13091-kube-api-access-7t7z7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:50.838848 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.838803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7z7\" (UniqueName: \"kubernetes.io/projected/077f8953-b5fa-43df-b690-0477cba13091-kube-api-access-7t7z7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:50.838999 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.838875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/077f8953-b5fa-43df-b690-0477cba13091-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:50.839209 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.839194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/077f8953-b5fa-43df-b690-0477cba13091-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" 
Apr 16 18:18:50.847945 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:50.847915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7z7\" (UniqueName: \"kubernetes.io/projected/077f8953-b5fa-43df-b690-0477cba13091-kube-api-access-7t7z7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-dlmgx\" (UID: \"077f8953-b5fa-43df-b690-0477cba13091\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:51.008654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:51.008558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" Apr 16 18:18:51.137329 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:51.137307 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx"] Apr 16 18:18:51.139648 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:18:51.139620 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077f8953_b5fa_43df_b690_0477cba13091.slice/crio-2c9a1ba123cdb08c03a2e1260fbdafa268b6d708d31e85604edf344d162ad87d WatchSource:0}: Error finding container 2c9a1ba123cdb08c03a2e1260fbdafa268b6d708d31e85604edf344d162ad87d: Status 404 returned error can't find the container with id 2c9a1ba123cdb08c03a2e1260fbdafa268b6d708d31e85604edf344d162ad87d Apr 16 18:18:51.332259 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:51.332163 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" event={"ID":"077f8953-b5fa-43df-b690-0477cba13091","Type":"ContainerStarted","Data":"2c9a1ba123cdb08c03a2e1260fbdafa268b6d708d31e85604edf344d162ad87d"} Apr 16 18:18:54.343837 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:54.343798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" event={"ID":"077f8953-b5fa-43df-b690-0477cba13091","Type":"ContainerStarted","Data":"d0ef9c4f4397f135173be29317e39efbbe6a5ce29054a8b1490508ef3790278d"} Apr 16 18:18:54.367853 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:18:54.367803 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-dlmgx" podStartSLOduration=1.6024635379999999 podStartE2EDuration="4.367786478s" podCreationTimestamp="2026-04-16 18:18:50 +0000 UTC" firstStartedPulling="2026-04-16 18:18:51.141945566 +0000 UTC m=+556.066496198" lastFinishedPulling="2026-04-16 18:18:53.907268505 +0000 UTC m=+558.831819138" observedRunningTime="2026-04-16 18:18:54.364821112 +0000 UTC m=+559.289371762" watchObservedRunningTime="2026-04-16 18:18:54.367786478 +0000 UTC m=+559.292337128" Apr 16 18:19:00.566483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.566440 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq"] Apr 16 18:19:00.570244 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.570221 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.572429 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.572407 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:19:00.572527 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.572408 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:19:00.573102 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.573087 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\"" Apr 16 18:19:00.585116 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.585086 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq"] Apr 16 18:19:00.622963 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.622913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkt9l\" (UniqueName: \"kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.622963 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.622970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.623179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.622990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.723868 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.723826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkt9l\" (UniqueName: \"kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.724061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.723880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.724061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.723908 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.724272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.724248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.724347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.724302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.733420 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.733399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkt9l\" (UniqueName: \"kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:00.887861 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:00.887774 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:01.013516 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:01.013483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq"] Apr 16 18:19:01.016286 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:01.016253 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd6c162_c09c_47d6_aedf_5996fe83d3d3.slice/crio-ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c WatchSource:0}: Error finding container ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c: Status 404 returned error can't find the container with id ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c Apr 16 18:19:01.368298 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:01.368263 2571 generic.go:358] "Generic (PLEG): container finished" podID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerID="c79ce6584bc6762073b52995da362f50d2e2e7685ab7ce5088da93b8746e94f4" exitCode=0 Apr 16 18:19:01.368472 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:01.368309 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" event={"ID":"acd6c162-c09c-47d6-aedf-5996fe83d3d3","Type":"ContainerDied","Data":"c79ce6584bc6762073b52995da362f50d2e2e7685ab7ce5088da93b8746e94f4"} Apr 16 18:19:01.368472 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:01.368329 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" event={"ID":"acd6c162-c09c-47d6-aedf-5996fe83d3d3","Type":"ContainerStarted","Data":"ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c"} Apr 16 18:19:04.380262 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:04.380228 2571 generic.go:358] "Generic (PLEG): container finished" podID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerID="78552813b9f86ddff83280707084db91f9c3b9d1fb7586458939f98d03c1e07e" exitCode=0 Apr 16 18:19:04.380614 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:04.380298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" event={"ID":"acd6c162-c09c-47d6-aedf-5996fe83d3d3","Type":"ContainerDied","Data":"78552813b9f86ddff83280707084db91f9c3b9d1fb7586458939f98d03c1e07e"} Apr 16 18:19:05.386109 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:05.386075 2571 generic.go:358] "Generic (PLEG): container finished" podID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerID="bcab8175eea4cb28242114a08abceeddfaad3dafdc49cbd1e21bb1f90c5dd2a8" exitCode=0 Apr 16 18:19:05.386462 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:05.386140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" event={"ID":"acd6c162-c09c-47d6-aedf-5996fe83d3d3","Type":"ContainerDied","Data":"bcab8175eea4cb28242114a08abceeddfaad3dafdc49cbd1e21bb1f90c5dd2a8"} Apr 16 18:19:06.513959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.513935 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:06.578338 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.578296 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util\") pod \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " Apr 16 18:19:06.578505 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.578349 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkt9l\" (UniqueName: \"kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l\") pod \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " Apr 16 18:19:06.578505 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.578403 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle\") pod \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\" (UID: \"acd6c162-c09c-47d6-aedf-5996fe83d3d3\") " Apr 16 18:19:06.578872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.578849 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle" (OuterVolumeSpecName: "bundle") pod "acd6c162-c09c-47d6-aedf-5996fe83d3d3" (UID: "acd6c162-c09c-47d6-aedf-5996fe83d3d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:19:06.580596 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.580569 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l" (OuterVolumeSpecName: "kube-api-access-wkt9l") pod "acd6c162-c09c-47d6-aedf-5996fe83d3d3" (UID: "acd6c162-c09c-47d6-aedf-5996fe83d3d3"). InnerVolumeSpecName "kube-api-access-wkt9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:19:06.583164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.583120 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util" (OuterVolumeSpecName: "util") pod "acd6c162-c09c-47d6-aedf-5996fe83d3d3" (UID: "acd6c162-c09c-47d6-aedf-5996fe83d3d3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:19:06.679589 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.679482 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkt9l\" (UniqueName: \"kubernetes.io/projected/acd6c162-c09c-47d6-aedf-5996fe83d3d3-kube-api-access-wkt9l\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:19:06.679589 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.679530 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:19:06.679589 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:06.679540 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/acd6c162-c09c-47d6-aedf-5996fe83d3d3-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:19:07.395112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:07.395077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" event={"ID":"acd6c162-c09c-47d6-aedf-5996fe83d3d3","Type":"ContainerDied","Data":"ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c"} Apr 16 18:19:07.395112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:07.395110 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccea25bc95ccb73bca207a116aac8c43992954dfb9cec7005bb92794caf52a2c" Apr 16 18:19:07.395112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:07.395111 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f6rslq" Apr 16 18:19:13.543018 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.542987 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cbbf69b9b-shr2f"] Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543303 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="pull" Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543315 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="pull" Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543334 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="util" Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543339 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="util" Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543345 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="extract" Apr 16 18:19:13.543364 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543350 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="extract" Apr 16 18:19:13.543542 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.543430 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="acd6c162-c09c-47d6-aedf-5996fe83d3d3" containerName="extract" Apr 16 18:19:13.549045 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.549021 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.559882 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.559858 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cbbf69b9b-shr2f"] Apr 16 18:19:13.639178 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639178 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-trusted-ca-bundle\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-oauth-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639307 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-console-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639332 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-service-ca\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-oauth-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.639654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.639477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjcl\" (UniqueName: \"kubernetes.io/projected/9c23d52c-1bac-4e48-b575-90a96e26c043-kube-api-access-2hjcl\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-console-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-service-ca\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740546 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-oauth-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740546 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hjcl\" (UniqueName: \"kubernetes.io/projected/9c23d52c-1bac-4e48-b575-90a96e26c043-kube-api-access-2hjcl\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740546 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740546 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-trusted-ca-bundle\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.740546 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.740407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-oauth-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.741135 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.741099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-console-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.741261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.741198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-service-ca\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.741315 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.741295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-oauth-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.741376 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.741359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c23d52c-1bac-4e48-b575-90a96e26c043-trusted-ca-bundle\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.742936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.742901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-serving-cert\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.742936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.742929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c23d52c-1bac-4e48-b575-90a96e26c043-console-oauth-config\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.751914 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.751893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjcl\" (UniqueName: \"kubernetes.io/projected/9c23d52c-1bac-4e48-b575-90a96e26c043-kube-api-access-2hjcl\") pod \"console-7cbbf69b9b-shr2f\" (UID: \"9c23d52c-1bac-4e48-b575-90a96e26c043\") " pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.859168 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.859078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:13.987112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:13.987087 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cbbf69b9b-shr2f"] Apr 16 18:19:13.989476 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:13.989449 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c23d52c_1bac_4e48_b575_90a96e26c043.slice/crio-798c666fc2ff01b73ab1b3b869f84548382986443ae383d595e65649bf3fc1e3 WatchSource:0}: Error finding container 798c666fc2ff01b73ab1b3b869f84548382986443ae383d595e65649bf3fc1e3: Status 404 returned error can't find the container with id 798c666fc2ff01b73ab1b3b869f84548382986443ae383d595e65649bf3fc1e3 Apr 16 18:19:14.013861 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.013835 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8"] Apr 16 18:19:14.019028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.019012 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.021486 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.021462 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 18:19:14.021597 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.021504 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:14.021597 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.021520 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-q28ct\"" Apr 16 18:19:14.025355 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.025334 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8"] Apr 16 18:19:14.143232 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.143144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnk7\" (UniqueName: \"kubernetes.io/projected/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-kube-api-access-hfnk7\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.143232 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.143197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.243704 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.243660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.243944 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.243765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnk7\" (UniqueName: \"kubernetes.io/projected/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-kube-api-access-hfnk7\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.244074 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.244053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.252931 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.252903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnk7\" (UniqueName: \"kubernetes.io/projected/b10ddccd-bd2f-41fe-a8f1-681e1f0030b3-kube-api-access-hfnk7\") pod \"openshift-lws-operator-bfc7f696d-mtzl8\" (UID: \"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.369228 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.369174 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" Apr 16 18:19:14.419698 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.419650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cbbf69b9b-shr2f" event={"ID":"9c23d52c-1bac-4e48-b575-90a96e26c043","Type":"ContainerStarted","Data":"a496822523c3fb643d3d1d3efa3dee5bffc1f09a1280a408bb87bc752c4547b9"} Apr 16 18:19:14.419841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.419723 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cbbf69b9b-shr2f" event={"ID":"9c23d52c-1bac-4e48-b575-90a96e26c043","Type":"ContainerStarted","Data":"798c666fc2ff01b73ab1b3b869f84548382986443ae383d595e65649bf3fc1e3"} Apr 16 18:19:14.437279 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.437134 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cbbf69b9b-shr2f" podStartSLOduration=1.437114237 podStartE2EDuration="1.437114237s" podCreationTimestamp="2026-04-16 18:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:14.436653588 +0000 UTC m=+579.361204254" watchObservedRunningTime="2026-04-16 18:19:14.437114237 +0000 UTC m=+579.361664888" Apr 16 18:19:14.494772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:14.494743 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8"] Apr 16 18:19:14.497268 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:14.497241 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10ddccd_bd2f_41fe_a8f1_681e1f0030b3.slice/crio-e614eb26118a94c897b20297a49f747c3021918a009af7954984532c3bdb0a5f WatchSource:0}: Error finding container e614eb26118a94c897b20297a49f747c3021918a009af7954984532c3bdb0a5f: Status 404 returned error can't find the container with id e614eb26118a94c897b20297a49f747c3021918a009af7954984532c3bdb0a5f Apr 16 18:19:15.425077 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:15.425028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" event={"ID":"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3","Type":"ContainerStarted","Data":"e614eb26118a94c897b20297a49f747c3021918a009af7954984532c3bdb0a5f"} Apr 16 18:19:16.429922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:16.429839 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" event={"ID":"b10ddccd-bd2f-41fe-a8f1-681e1f0030b3","Type":"ContainerStarted","Data":"142f7c96645de539fd899a36e027423836e7f3a7ed0b0dcca94e3d45bbb76ab7"} Apr 16 18:19:16.450391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:16.450344 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-mtzl8" podStartSLOduration=1.8289418259999999 podStartE2EDuration="3.450329263s" podCreationTimestamp="2026-04-16 18:19:13 +0000 UTC" firstStartedPulling="2026-04-16 18:19:14.498750611 +0000 UTC m=+579.423301243" lastFinishedPulling="2026-04-16 18:19:16.120138047 +0000 UTC m=+581.044688680" 
observedRunningTime="2026-04-16 18:19:16.449616448 +0000 UTC m=+581.374167100" watchObservedRunningTime="2026-04-16 18:19:16.450329263 +0000 UTC m=+581.374879914" Apr 16 18:19:23.859201 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:23.859167 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:23.859755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:23.859235 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:23.868892 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:23.864732 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:24.459324 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:24.459296 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cbbf69b9b-shr2f" Apr 16 18:19:24.508517 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:24.508482 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:19:28.398665 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.398631 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"] Apr 16 18:19:28.402277 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.402261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" Apr 16 18:19:28.404862 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.404840 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:19:28.404954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.404845 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\"" Apr 16 18:19:28.405405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.405384 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:19:28.411141 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.411119 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"] Apr 16 18:19:28.570100 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.570061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jh9\" (UniqueName: \"kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" Apr 16 18:19:28.570343 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.570126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" Apr 16 
18:19:28.570343 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.570202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.671593 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.671514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.671593 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.671553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.671802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.671605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jh9\" (UniqueName: \"kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.671902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.671883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.671996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.671974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.680475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.680454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jh9\" (UniqueName: \"kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.712941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.712914 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:28.839183 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:28.839147 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"]
Apr 16 18:19:28.841329 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:28.841297 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3adcc2_b5d5_457a_8540_c677a796b6a4.slice/crio-5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078 WatchSource:0}: Error finding container 5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078: Status 404 returned error can't find the container with id 5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078
Apr 16 18:19:29.475099 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:29.475059 2571 generic.go:358] "Generic (PLEG): container finished" podID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerID="c82777ce61f32ffb014ccef6a4079b71fc42f7a3ddc946a8be1e8b1a961ccc37" exitCode=0
Apr 16 18:19:29.475464 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:29.475136 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" event={"ID":"fe3adcc2-b5d5-457a-8540-c677a796b6a4","Type":"ContainerDied","Data":"c82777ce61f32ffb014ccef6a4079b71fc42f7a3ddc946a8be1e8b1a961ccc37"}
Apr 16 18:19:29.475464 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:29.475169 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" event={"ID":"fe3adcc2-b5d5-457a-8540-c677a796b6a4","Type":"ContainerStarted","Data":"5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078"}
Apr 16 18:19:30.479883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:30.479853 2571 generic.go:358] "Generic (PLEG): container finished" podID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerID="db596f73bd86ed9aa9d60c981567ee42a772a4a0d5860b6960a5dd825bc94708" exitCode=0
Apr 16 18:19:30.480261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:30.479900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" event={"ID":"fe3adcc2-b5d5-457a-8540-c677a796b6a4","Type":"ContainerDied","Data":"db596f73bd86ed9aa9d60c981567ee42a772a4a0d5860b6960a5dd825bc94708"}
Apr 16 18:19:31.486208 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:31.486172 2571 generic.go:358] "Generic (PLEG): container finished" podID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerID="5119af54959adaec22f3cbe9eaf02672000b848c1f3626ea115ceb9c9d99ffc2" exitCode=0
Apr 16 18:19:31.486584 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:31.486231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" event={"ID":"fe3adcc2-b5d5-457a-8540-c677a796b6a4","Type":"ContainerDied","Data":"5119af54959adaec22f3cbe9eaf02672000b848c1f3626ea115ceb9c9d99ffc2"}
Apr 16 18:19:32.613873 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.613850 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:32.708867 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.708833 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util\") pod \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") "
Apr 16 18:19:32.709028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.708876 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle\") pod \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") "
Apr 16 18:19:32.709028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.708919 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2jh9\" (UniqueName: \"kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9\") pod \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\" (UID: \"fe3adcc2-b5d5-457a-8540-c677a796b6a4\") "
Apr 16 18:19:32.709930 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.709901 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle" (OuterVolumeSpecName: "bundle") pod "fe3adcc2-b5d5-457a-8540-c677a796b6a4" (UID: "fe3adcc2-b5d5-457a-8540-c677a796b6a4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:32.711044 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.711021 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9" (OuterVolumeSpecName: "kube-api-access-s2jh9") pod "fe3adcc2-b5d5-457a-8540-c677a796b6a4" (UID: "fe3adcc2-b5d5-457a-8540-c677a796b6a4"). InnerVolumeSpecName "kube-api-access-s2jh9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:32.714177 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.714155 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util" (OuterVolumeSpecName: "util") pod "fe3adcc2-b5d5-457a-8540-c677a796b6a4" (UID: "fe3adcc2-b5d5-457a-8540-c677a796b6a4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:32.809680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.809576 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:32.809680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.809628 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3adcc2-b5d5-457a-8540-c677a796b6a4-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:32.809680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:32.809640 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2jh9\" (UniqueName: \"kubernetes.io/projected/fe3adcc2-b5d5-457a-8540-c677a796b6a4-kube-api-access-s2jh9\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:33.494235 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:33.494205 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6"
Apr 16 18:19:33.494381 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:33.494205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835p7vt6" event={"ID":"fe3adcc2-b5d5-457a-8540-c677a796b6a4","Type":"ContainerDied","Data":"5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078"}
Apr 16 18:19:33.494381 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:33.494315 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5edf3c32bbb1a9dc3040fff5d8111eb25e4e3ee277770fd35683d1b67cc1a078"
Apr 16 18:19:35.509972 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:35.509943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log"
Apr 16 18:19:35.510754 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:35.510732 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log"
Apr 16 18:19:43.322576 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322538 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"]
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322907 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="pull"
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322923 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="pull"
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322941 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="extract"
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322950 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="extract"
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322967 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="util"
Apr 16 18:19:43.322985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.322973 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="util"
Apr 16 18:19:43.323217 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.323034 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe3adcc2-b5d5-457a-8540-c677a796b6a4" containerName="extract"
Apr 16 18:19:43.327448 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.327431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:43.330040 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.330019 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:19:43.330176 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.330025 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\""
Apr 16 18:19:43.330809 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.330792 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:19:43.337605 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.337586 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"]
Apr 16 18:19:43.394554 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.394517 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ljx\" (UniqueName: \"kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:43.394740 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.394559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:43.394740 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.394642 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:43.495493 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.495452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ljx\" (UniqueName: \"kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
\"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.495493 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.495492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.495679 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.495551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.495906 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.495890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.495983 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.495963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.511043 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.511009 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ljx\" (UniqueName: \"kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.637791 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.637665 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" Apr 16 18:19:43.763505 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:43.763436 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"] Apr 16 18:19:43.766220 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:43.766191 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bee9d62_a1b7_4c1f_9ecd_e392f0f4cb80.slice/crio-9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785 WatchSource:0}: Error finding container 9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785: Status 404 returned error can't find the container with id 9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785 Apr 16 18:19:44.540849 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:44.540816 2571 generic.go:358] "Generic (PLEG): container finished" podID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerID="5a1c5175cd4fd672a016612486850357798b9362777eb3604747fa7e752f0685" exitCode=0 Apr 16 18:19:44.541253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:44.540906 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" event={"ID":"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80","Type":"ContainerDied","Data":"5a1c5175cd4fd672a016612486850357798b9362777eb3604747fa7e752f0685"} Apr 16 18:19:44.541253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:44.540955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" event={"ID":"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80","Type":"ContainerStarted","Data":"9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785"} Apr 16 18:19:45.470315 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.470282 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-xklxt"] Apr 16 18:19:45.473872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.473854 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.476348 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.476315 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 18:19:45.476348 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.476325 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 18:19:45.476504 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.476401 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-m5kxn\"" Apr 16 18:19:45.492014 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.491983 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-xklxt"] Apr 16 18:19:45.517403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.517366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/053e8b4c-870f-427c-8a50-bf564020a064-operator-config\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: \"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.517510 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.517425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xmx\" (UniqueName: \"kubernetes.io/projected/053e8b4c-870f-427c-8a50-bf564020a064-kube-api-access-m6xmx\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: \"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.618831 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.618797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/053e8b4c-870f-427c-8a50-bf564020a064-operator-config\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: \"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.619260 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.618838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xmx\" (UniqueName: \"kubernetes.io/projected/053e8b4c-870f-427c-8a50-bf564020a064-kube-api-access-m6xmx\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: \"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.621368 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.621342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/053e8b4c-870f-427c-8a50-bf564020a064-operator-config\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: \"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.630001 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.629972 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xmx\" (UniqueName: \"kubernetes.io/projected/053e8b4c-870f-427c-8a50-bf564020a064-kube-api-access-m6xmx\") pod \"servicemesh-operator3-55f49c5f94-xklxt\" (UID: 
\"053e8b4c-870f-427c-8a50-bf564020a064\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.782941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.782844 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:45.912028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:45.911900 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-xklxt"] Apr 16 18:19:45.914451 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:19:45.914423 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053e8b4c_870f_427c_8a50_bf564020a064.slice/crio-a72be7cb5114dd02d71d35fa749df08b56eb73b41343c49e9d84dd452981caf8 WatchSource:0}: Error finding container a72be7cb5114dd02d71d35fa749df08b56eb73b41343c49e9d84dd452981caf8: Status 404 returned error can't find the container with id a72be7cb5114dd02d71d35fa749df08b56eb73b41343c49e9d84dd452981caf8 Apr 16 18:19:46.548973 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:46.548944 2571 generic.go:358] "Generic (PLEG): container finished" podID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerID="906ab72a522e049f1e1ed6a7883c69957178e1dcc9473a55c9208d97fd9375d1" exitCode=0 Apr 16 18:19:46.549141 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:46.549030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" event={"ID":"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80","Type":"ContainerDied","Data":"906ab72a522e049f1e1ed6a7883c69957178e1dcc9473a55c9208d97fd9375d1"} Apr 16 18:19:46.550314 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:46.550291 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" event={"ID":"053e8b4c-870f-427c-8a50-bf564020a064","Type":"ContainerStarted","Data":"a72be7cb5114dd02d71d35fa749df08b56eb73b41343c49e9d84dd452981caf8"} Apr 16 18:19:47.556558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:47.556474 2571 generic.go:358] "Generic (PLEG): container finished" podID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerID="df5eb1b58540ddbcfbb74b89cc61a35cf05cca72746f8d429901b725c2efd48d" exitCode=0 Apr 16 18:19:47.556558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:47.556521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" event={"ID":"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80","Type":"ContainerDied","Data":"df5eb1b58540ddbcfbb74b89cc61a35cf05cca72746f8d429901b725c2efd48d"} Apr 16 18:19:48.564492 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.564364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" event={"ID":"053e8b4c-870f-427c-8a50-bf564020a064","Type":"ContainerStarted","Data":"35b4e9ee035ca8598b0578c7b73876e49ede483ea6933b66cf831db8fa5136c6"} Apr 16 18:19:48.564492 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.564458 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:19:48.584577 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.584518 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" 
Apr 16 18:19:48.700965 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.700941 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:48.747004 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.746969 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle\") pod \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") "
Apr 16 18:19:48.747157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.747090 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ljx\" (UniqueName: \"kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx\") pod \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") "
Apr 16 18:19:48.747157 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.747121 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util\") pod \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\" (UID: \"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80\") "
Apr 16 18:19:48.748191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.748161 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle" (OuterVolumeSpecName: "bundle") pod "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" (UID: "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:48.749098 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.749078 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx" (OuterVolumeSpecName: "kube-api-access-d5ljx") pod "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" (UID: "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80"). InnerVolumeSpecName "kube-api-access-d5ljx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:48.752478 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.752452 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util" (OuterVolumeSpecName: "util") pod "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" (UID: "7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:48.848792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.848745 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5ljx\" (UniqueName: \"kubernetes.io/projected/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-kube-api-access-d5ljx\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:48.848792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.848778 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:48.848792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:48.848789 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.533884 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.533820 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-87977c59d-f5s79" podUID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" containerName="console" containerID="cri-o://35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d" gracePeriod=15
Apr 16 18:19:49.571560 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.571530 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s"
Apr 16 18:19:49.571941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.571529 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2dfd8s" event={"ID":"7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80","Type":"ContainerDied","Data":"9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785"}
Apr 16 18:19:49.571941 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.571645 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9975dcd080652c8772444ab5e898ecf2cb2bb84448696f0d874a5847bece0785"
Apr 16 18:19:49.769670 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.769644 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87977c59d-f5s79_e1b107aa-f6da-49ce-abb2-9db8f9af18ab/console/0.log"
Apr 16 18:19:49.769822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.769722 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87977c59d-f5s79"
Apr 16 18:19:49.858119 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858024 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq85j\" (UniqueName: \"kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858119 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858069 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858119 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858106 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858206 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858257 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858309 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858358 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert\") pod \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\" (UID: \"e1b107aa-f6da-49ce-abb2-9db8f9af18ab\") "
Apr 16 18:19:49.858599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858529 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca" (OuterVolumeSpecName: "service-ca") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:49.858599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858479 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:49.858599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858487 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config" (OuterVolumeSpecName: "console-config") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:49.858803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858666 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.858803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858699 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-oauth-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.858803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858715 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-service-ca\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.858803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.858709 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:49.860442 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.860418 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:49.860744 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.860720 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:49.860744 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.860738 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j" (OuterVolumeSpecName: "kube-api-access-dq85j") pod "e1b107aa-f6da-49ce-abb2-9db8f9af18ab" (UID: "e1b107aa-f6da-49ce-abb2-9db8f9af18ab"). InnerVolumeSpecName "kube-api-access-dq85j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:49.959340 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.959310 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dq85j\" (UniqueName: \"kubernetes.io/projected/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-kube-api-access-dq85j\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.959340 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.959338 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-trusted-ca-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.959536 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.959351 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-oauth-config\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:49.959536 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:49.959363 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b107aa-f6da-49ce-abb2-9db8f9af18ab-console-serving-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:19:50.576316 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87977c59d-f5s79_e1b107aa-f6da-49ce-abb2-9db8f9af18ab/console/0.log"
Apr 16 18:19:50.576798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576329 2571 generic.go:358] "Generic (PLEG): container finished" podID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" containerID="35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d" exitCode=2
Apr 16 18:19:50.576798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87977c59d-f5s79" event={"ID":"e1b107aa-f6da-49ce-abb2-9db8f9af18ab","Type":"ContainerDied","Data":"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d"}
Apr 16 18:19:50.576798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576393 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87977c59d-f5s79"
Need to start a new one" pod="openshift-console/console-87977c59d-f5s79" Apr 16 18:19:50.576798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576412 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87977c59d-f5s79" event={"ID":"e1b107aa-f6da-49ce-abb2-9db8f9af18ab","Type":"ContainerDied","Data":"baaa685ba2e756fd3dfb9af21854adb5fe36eda07ff66b0acecb1580c7a0ca91"} Apr 16 18:19:50.576798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.576437 2571 scope.go:117] "RemoveContainer" containerID="35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d" Apr 16 18:19:50.584865 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.584841 2571 scope.go:117] "RemoveContainer" containerID="35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d" Apr 16 18:19:50.585137 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:19:50.585120 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d\": container with ID starting with 35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d not found: ID does not exist" containerID="35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d" Apr 16 18:19:50.585192 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.585147 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d"} err="failed to get container status \"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d\": rpc error: code = NotFound desc = could not find container \"35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d\": container with ID starting with 35c74af7cbfe76651d17044aed1b65433ca5352c8cc85900a1313cad4a007e4d not found: ID does not exist" Apr 16 18:19:50.601813 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.601783 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:19:50.603498 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:50.603478 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-87977c59d-f5s79"] Apr 16 18:19:51.641391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:51.641355 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" path="/var/lib/kubelet/pods/e1b107aa-f6da-49ce-abb2-9db8f9af18ab/volumes" Apr 16 18:19:59.573738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:19:59.573708 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-xklxt" Apr 16 18:20:04.545405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545357 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545729 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="util" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545743 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="util" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545764 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="extract" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545769 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="extract" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545778 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="pull" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545783 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="pull" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545791 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" containerName="console" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545796 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" containerName="console" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545859 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bee9d62-a1b7-4c1f-9ecd-e392f0f4cb80" containerName="extract" Apr 16 18:20:04.545902 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.545868 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1b107aa-f6da-49ce-abb2-9db8f9af18ab" containerName="console" Apr 16 18:20:04.548445 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.548422 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.552041 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.552012 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 18:20:04.552170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.552079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 18:20:04.552170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.552084 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 18:20:04.552170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.552110 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-zphnw\"" Apr 16 18:20:04.552291 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.552233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 18:20:04.558165 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.558140 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:20:04.687868 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.687816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.687876 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.687985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.688020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm8qw\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688248 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.688069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688248 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.688162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.688248 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.688188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789124 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789124 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm8qw\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.789385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.790046 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.789968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.791834 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.791800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.791961 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.791830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.791961 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.791949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.792149 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.792129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.798612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.798542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.798822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.798805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm8qw\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw\") pod \"istiod-openshift-gateway-7cd77c7ffd-qvnpq\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.858851 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.858812 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:04.996273 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:04.996190 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:20:04.999089 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:04.999052 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9086ce5_8be0_4692_8e7d_1e66a0200a6e.slice/crio-913010c7bc8852eca942ccee384143bf1171701fffafeb1d9aec87039282b0c4 WatchSource:0}: Error finding container 913010c7bc8852eca942ccee384143bf1171701fffafeb1d9aec87039282b0c4: Status 404 returned error can't find the container with id 913010c7bc8852eca942ccee384143bf1171701fffafeb1d9aec87039282b0c4 Apr 16 18:20:05.635698 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:05.635654 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" event={"ID":"d9086ce5-8be0-4692-8e7d-1e66a0200a6e","Type":"ContainerStarted","Data":"913010c7bc8852eca942ccee384143bf1171701fffafeb1d9aec87039282b0c4"} Apr 16 18:20:08.029665 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:08.029625 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:20:08.030028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:08.029714 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:20:08.648721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:08.648662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" event={"ID":"d9086ce5-8be0-4692-8e7d-1e66a0200a6e","Type":"ContainerStarted","Data":"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95"} Apr 16 18:20:08.648920 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:08.648747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:08.667016 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:08.666953 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" podStartSLOduration=1.639307066 podStartE2EDuration="4.666936943s" podCreationTimestamp="2026-04-16 18:20:04 +0000 UTC" firstStartedPulling="2026-04-16 18:20:05.001750077 +0000 UTC m=+629.926300724" lastFinishedPulling="2026-04-16 18:20:08.029379969 +0000 UTC m=+632.953930601" observedRunningTime="2026-04-16 18:20:08.665785792 +0000 UTC m=+633.590336444" watchObservedRunningTime="2026-04-16 18:20:08.666936943 +0000 UTC m=+633.591487594" Apr 16 18:20:09.654252 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:09.654223 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:20:10.514018 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.513982 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz"] Apr 16 18:20:10.517087 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.517064 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.519482 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.519448 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-zr759\"" Apr 16 18:20:10.526949 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.526364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz"] Apr 16 18:20:10.640334 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640538 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640538 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640538 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtfxs\" (UniqueName: \"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-kube-api-access-qtfxs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640678 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.640837 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.640700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741359 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtfxs\" (UniqueName: \"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-kube-api-access-qtfxs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.741872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.742321 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.741885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.742382 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.742328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.742523 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.742500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.742573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.742513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.742746 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.742723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.744138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.744115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.744499 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.744478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.750370 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.750341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.750614 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.750596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtfxs\" (UniqueName: \"kubernetes.io/projected/839f2fc2-b2e3-439f-b39e-b7a645beaf48-kube-api-access-qtfxs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gqksz\" (UID: \"839f2fc2-b2e3-439f-b39e-b7a645beaf48\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.832031 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.831933 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:10.962229 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:10.962200 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz"] Apr 16 18:20:10.964095 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:10.964064 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839f2fc2_b2e3_439f_b39e_b7a645beaf48.slice/crio-05246aa6b4b1524f1cbb4560ad5e3c30690fedc0040da3893ed80dd95d21efb0 WatchSource:0}: Error finding container 05246aa6b4b1524f1cbb4560ad5e3c30690fedc0040da3893ed80dd95d21efb0: Status 404 returned error can't find the container with id 05246aa6b4b1524f1cbb4560ad5e3c30690fedc0040da3893ed80dd95d21efb0 Apr 16 18:20:11.660226 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:11.660189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" event={"ID":"839f2fc2-b2e3-439f-b39e-b7a645beaf48","Type":"ContainerStarted","Data":"05246aa6b4b1524f1cbb4560ad5e3c30690fedc0040da3893ed80dd95d21efb0"} Apr 16 18:20:15.673850 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:15.673813 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:20:15.674160 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:15.673878 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:20:15.674160 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:15.673925 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:20:16.680231 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:16.680198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" event={"ID":"839f2fc2-b2e3-439f-b39e-b7a645beaf48","Type":"ContainerStarted","Data":"85030edf25c7c353108c58c45e9310403ce0e9a140ab2babef04058d1a9c9884"} Apr 16 18:20:16.700393 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:16.700339 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" podStartSLOduration=1.9929564480000002 podStartE2EDuration="6.700322068s" podCreationTimestamp="2026-04-16 18:20:10 +0000 UTC" firstStartedPulling="2026-04-16 18:20:10.966171962 +0000 UTC m=+635.890722608" lastFinishedPulling="2026-04-16 18:20:15.673537585 +0000 UTC m=+640.598088228" observedRunningTime="2026-04-16 18:20:16.699715336 +0000 UTC m=+641.624265984" watchObservedRunningTime="2026-04-16 18:20:16.700322068 +0000 UTC m=+641.624872720" Apr 16 18:20:16.832600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:16.832564 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:16.837244 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:16.837221 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:17.684240 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:17.684206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:17.685185 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:17.685165 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gqksz" Apr 16 18:20:19.717641 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.717607 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv"] Apr 16 18:20:19.721116 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.721095 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.723295 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.723271 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdthj\"" Apr 16 18:20:19.723392 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.723321 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:20:19.723392 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.723276 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:20:19.729452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.729424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv"] Apr 16 18:20:19.815262 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.815228 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4"] Apr 16 18:20:19.818775 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.818758 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.819529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dq57\" (UniqueName: \"kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.819610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.819610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.819718 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmxs\" (UniqueName: \"kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.819773 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.819829 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.819814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.826894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.826867 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4"] Apr 16 18:20:19.913282 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.913246 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk"] Apr 16 18:20:19.916730 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.916714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:19.920167 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.920236 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmxs\" (UniqueName: \"kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.920236 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.920341 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.920341 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dq57\" (UniqueName: \"kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.920341 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.920664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle\") pod 
\"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.920664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.920788 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.920823 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.920788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:19.924220 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.924171 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk"] Apr 16 18:20:19.928392 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.928369 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dq57\" (UniqueName: \"kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:19.928485 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:19.928408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmxs\" (UniqueName: \"kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:20.018210 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.018123 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm"] Apr 16 18:20:20.020961 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.020937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 
18:20:20.021037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.020978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.021037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.021005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92b86\" (UniqueName: \"kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.021959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.021945 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.030811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.030777 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm"] Apr 16 18:20:20.030950 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.030851 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:20.122088 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8984\" (UniqueName: \"kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.122259 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.122259 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.122372 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util\") pod 
\"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.122372 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.122372 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92b86\" (UniqueName: \"kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.122681 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.122806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.122767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.128609 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.128586 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:20.131780 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.131758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92b86\" (UniqueName: \"kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.167983 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.167953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv"] Apr 16 18:20:20.170649 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:20.170589 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a57bf6e_dea4_47e1_83e5_8e41a92a7ede.slice/crio-361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1 WatchSource:0}: Error finding container 361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1: Status 404 returned error can't find the container with id 361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1 Apr 16 18:20:20.223321 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.223289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8984\" (UniqueName: \"kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.223460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.223377 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.223460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.223409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.223926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.223832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.223926 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.223879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.227119 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.227101 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:20.233383 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.233329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8984\" (UniqueName: \"kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.265705 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.265662 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4"] Apr 16 18:20:20.269222 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:20.269150 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50585f9d_aa82_477c_a6f1_e3c0619b1eb5.slice/crio-a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70 WatchSource:0}: Error finding container a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70: Status 404 returned error can't find the container with id a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70 Apr 16 18:20:20.332756 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.332724 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:20.361458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.361434 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk"] Apr 16 18:20:20.372406 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:20.372377 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc117a7cb_e729_40dd_8d73_c555838139f8.slice/crio-70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326 WatchSource:0}: Error finding container 70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326: Status 404 returned error can't find the container with id 70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326 Apr 16 18:20:20.468922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.468895 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm"] Apr 16 18:20:20.546637 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:20.546561 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ba3bbd_0686_4c91_a4dc_ff6e1a1a7be8.slice/crio-a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e WatchSource:0}: Error finding container a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e: Status 404 returned error can't find the container with id a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e Apr 16 18:20:20.695866 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.695824 2571 generic.go:358] "Generic (PLEG): container finished" podID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerID="b646a2e5a5b52a54f573627ef13e5ab3805e4bb411426492a85634db339f6c64" exitCode=0 Apr 16 18:20:20.696043 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.695913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" event={"ID":"50585f9d-aa82-477c-a6f1-e3c0619b1eb5","Type":"ContainerDied","Data":"b646a2e5a5b52a54f573627ef13e5ab3805e4bb411426492a85634db339f6c64"} Apr 16 18:20:20.696043 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.695948 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" event={"ID":"50585f9d-aa82-477c-a6f1-e3c0619b1eb5","Type":"ContainerStarted","Data":"a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70"} Apr 16 18:20:20.697377 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.697306 2571 generic.go:358] "Generic (PLEG): container finished" podID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerID="d40457d4e62f1a1486a2bc8b60ba5d28d03e7645dea570bfbb26bcdd95dc2ef0" exitCode=0 Apr 16 18:20:20.697460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.697382 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" event={"ID":"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede","Type":"ContainerDied","Data":"d40457d4e62f1a1486a2bc8b60ba5d28d03e7645dea570bfbb26bcdd95dc2ef0"} Apr 16 18:20:20.697460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.697411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" event={"ID":"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede","Type":"ContainerStarted","Data":"361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1"} Apr 16 18:20:20.698841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.698821 2571 generic.go:358] "Generic (PLEG): container finished" podID="c117a7cb-e729-40dd-8d73-c555838139f8" containerID="04a30849e212f27c97d8ab7a7be5c69c0c02bd9464261f9ee87ee7ef6f04c76e" exitCode=0 Apr 16 18:20:20.698911 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.698883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" event={"ID":"c117a7cb-e729-40dd-8d73-c555838139f8","Type":"ContainerDied","Data":"04a30849e212f27c97d8ab7a7be5c69c0c02bd9464261f9ee87ee7ef6f04c76e"} Apr 16 18:20:20.698911 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.698899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" event={"ID":"c117a7cb-e729-40dd-8d73-c555838139f8","Type":"ContainerStarted","Data":"70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326"} Apr 16 18:20:20.700416 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.700394 2571 generic.go:358] "Generic (PLEG): container finished" podID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerID="9d5593e0d0b6f8eae977b4ebe7b4252a9d7b711437ec0f4ad389f06b7e7af83b" exitCode=0 Apr 16 18:20:20.700517 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.700468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" event={"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerDied","Data":"9d5593e0d0b6f8eae977b4ebe7b4252a9d7b711437ec0f4ad389f06b7e7af83b"} Apr 16 18:20:20.700517 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:20.700489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" event={"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerStarted","Data":"a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e"} Apr 16 18:20:22.712720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.712666 2571 generic.go:358] "Generic (PLEG): container finished" podID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerID="aabcfd3e836c26fcfaaf0e4832b5b440a11dbfad94d4825451c434a2cd19b60c" exitCode=0 Apr 16 18:20:22.712720 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.712715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" event={"ID":"50585f9d-aa82-477c-a6f1-e3c0619b1eb5","Type":"ContainerDied","Data":"aabcfd3e836c26fcfaaf0e4832b5b440a11dbfad94d4825451c434a2cd19b60c"} Apr 16 18:20:22.714388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.714347 2571 generic.go:358] "Generic (PLEG): container finished" podID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerID="75db4943ae99db04b8b3c2c1dd1868470e0613906ca66deb842c2d4616dadc64" exitCode=0 Apr 16 18:20:22.714476 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.714422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" 
event={"ID":"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede","Type":"ContainerDied","Data":"75db4943ae99db04b8b3c2c1dd1868470e0613906ca66deb842c2d4616dadc64"} Apr 16 18:20:22.715966 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.715942 2571 generic.go:358] "Generic (PLEG): container finished" podID="c117a7cb-e729-40dd-8d73-c555838139f8" containerID="cde4d4b4169d3440d097b19efe06d377aba85e214e9e7a9c628e7f1457aa4bf6" exitCode=0 Apr 16 18:20:22.716057 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.716002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" event={"ID":"c117a7cb-e729-40dd-8d73-c555838139f8","Type":"ContainerDied","Data":"cde4d4b4169d3440d097b19efe06d377aba85e214e9e7a9c628e7f1457aa4bf6"} Apr 16 18:20:22.717751 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.717712 2571 generic.go:358] "Generic (PLEG): container finished" podID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerID="4af76027c5b3cdee7afa3dcb1e128c812a22e4ab98890fe9d28ab3894e7cd656" exitCode=0 Apr 16 18:20:22.717751 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:22.717747 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" event={"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerDied","Data":"4af76027c5b3cdee7afa3dcb1e128c812a22e4ab98890fe9d28ab3894e7cd656"} Apr 16 18:20:23.723638 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.723599 2571 generic.go:358] "Generic (PLEG): container finished" podID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerID="e5187caa8438898fbfacb7feb85e4a79e2abcfdde062f0cd94120a0a51355a58" exitCode=0 Apr 16 18:20:23.724089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.723677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" event={"ID":"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede","Type":"ContainerDied","Data":"e5187caa8438898fbfacb7feb85e4a79e2abcfdde062f0cd94120a0a51355a58"} Apr 16 18:20:23.725402 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.725380 2571 generic.go:358] "Generic (PLEG): container finished" podID="c117a7cb-e729-40dd-8d73-c555838139f8" containerID="d51c1adc9c9bdd8e3f7c2b4785dbeece731aa407a0852bb54d8ab48742e21331" exitCode=0 Apr 16 18:20:23.725511 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.725442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" event={"ID":"c117a7cb-e729-40dd-8d73-c555838139f8","Type":"ContainerDied","Data":"d51c1adc9c9bdd8e3f7c2b4785dbeece731aa407a0852bb54d8ab48742e21331"} Apr 16 18:20:23.727129 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.727110 2571 generic.go:358] "Generic (PLEG): container finished" podID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerID="b9a836b654d9c35c084fe2ce8c0a716ebf8a0dc5acabcae6526c7f32f2f0b12c" exitCode=0 Apr 16 18:20:23.727231 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.727190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" event={"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerDied","Data":"b9a836b654d9c35c084fe2ce8c0a716ebf8a0dc5acabcae6526c7f32f2f0b12c"} Apr 16 18:20:23.728985 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.728964 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerID="c198dd06fe29137dc8f3fc6ceb86458319458a6e6fefa26d8c4c17e11499cddc" exitCode=0 Apr 16 18:20:23.729061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:23.729010 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" event={"ID":"50585f9d-aa82-477c-a6f1-e3c0619b1eb5","Type":"ContainerDied","Data":"c198dd06fe29137dc8f3fc6ceb86458319458a6e6fefa26d8c4c17e11499cddc"} Apr 16 18:20:24.867789 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.867760 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" Apr 16 18:20:24.912939 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.912911 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:24.937199 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.937170 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:24.940714 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.940674 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:24.959297 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959266 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle\") pod \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " Apr 16 18:20:24.959455 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959326 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle\") pod \"c117a7cb-e729-40dd-8d73-c555838139f8\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " Apr 16 18:20:24.959455 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959367 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util\") pod \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " Apr 16 18:20:24.959455 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959406 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dq57\" (UniqueName: \"kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57\") pod \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " Apr 16 18:20:24.959455 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util\") pod \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959480 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle\") pod \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\" (UID: \"50585f9d-aa82-477c-a6f1-e3c0619b1eb5\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959501 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztmxs\" (UniqueName: \"kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs\") pod \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959523 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util\") pod \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959559 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util\") pod \"c117a7cb-e729-40dd-8d73-c555838139f8\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959602 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92b86\" (UniqueName: \"kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86\") pod \"c117a7cb-e729-40dd-8d73-c555838139f8\" (UID: \"c117a7cb-e729-40dd-8d73-c555838139f8\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959631 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle\") pod \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\" (UID: \"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959675 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8984\" (UniqueName: \"kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984\") pod \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\" (UID: \"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8\") " Apr 16 18:20:24.959883 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959875 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle" (OuterVolumeSpecName: "bundle") pod "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" (UID: "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.960264 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.959962 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:24.964553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.960502 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle" (OuterVolumeSpecName: "bundle") pod "c117a7cb-e729-40dd-8d73-c555838139f8" (UID: "c117a7cb-e729-40dd-8d73-c555838139f8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.964553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.961178 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle" (OuterVolumeSpecName: "bundle") pod "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" (UID: "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.964553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.962785 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984" (OuterVolumeSpecName: "kube-api-access-m8984") pod "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" (UID: "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8"). InnerVolumeSpecName "kube-api-access-m8984". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:24.964553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.963253 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57" (OuterVolumeSpecName: "kube-api-access-2dq57") pod "50585f9d-aa82-477c-a6f1-e3c0619b1eb5" (UID: "50585f9d-aa82-477c-a6f1-e3c0619b1eb5"). InnerVolumeSpecName "kube-api-access-2dq57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:24.964553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.963769 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle" (OuterVolumeSpecName: "bundle") pod "50585f9d-aa82-477c-a6f1-e3c0619b1eb5" (UID: "50585f9d-aa82-477c-a6f1-e3c0619b1eb5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.964960 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.964926 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs" (OuterVolumeSpecName: "kube-api-access-ztmxs") pod "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" (UID: "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede"). InnerVolumeSpecName "kube-api-access-ztmxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:24.965188 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.965162 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86" (OuterVolumeSpecName: "kube-api-access-92b86") pod "c117a7cb-e729-40dd-8d73-c555838139f8" (UID: "c117a7cb-e729-40dd-8d73-c555838139f8"). InnerVolumeSpecName "kube-api-access-92b86". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:24.967606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.967567 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util" (OuterVolumeSpecName: "util") pod "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" (UID: "8a57bf6e-dea4-47e1-83e5-8e41a92a7ede"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.968048 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.968028 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util" (OuterVolumeSpecName: "util") pod "c117a7cb-e729-40dd-8d73-c555838139f8" (UID: "c117a7cb-e729-40dd-8d73-c555838139f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.969427 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.969405 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util" (OuterVolumeSpecName: "util") pod "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" (UID: "84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:24.969656 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:24.969638 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util" (OuterVolumeSpecName: "util") pod "50585f9d-aa82-477c-a6f1-e3c0619b1eb5" (UID: "50585f9d-aa82-477c-a6f1-e3c0619b1eb5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061096 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061125 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2dq57\" (UniqueName: \"kubernetes.io/projected/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-kube-api-access-2dq57\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061136 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061145 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50585f9d-aa82-477c-a6f1-e3c0619b1eb5-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061153 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ztmxs\" (UniqueName: \"kubernetes.io/projected/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-kube-api-access-ztmxs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061161 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061170 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-util\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061177 2571 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92b86\" (UniqueName: \"kubernetes.io/projected/c117a7cb-e729-40dd-8d73-c555838139f8-kube-api-access-92b86\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061185 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a57bf6e-dea4-47e1-83e5-8e41a92a7ede-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061194 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8984\" (UniqueName: \"kubernetes.io/projected/84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8-kube-api-access-m8984\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.061600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.061203 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c117a7cb-e729-40dd-8d73-c555838139f8-bundle\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:20:25.737894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.737859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" event={"ID":"c117a7cb-e729-40dd-8d73-c555838139f8","Type":"ContainerDied","Data":"70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326"} Apr 16 18:20:25.737894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.737890 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dqhsk" Apr 16 18:20:25.737894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.737899 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c7e8b3053581d9422894c8503f7382ce74bb9ada9bb1d72c132a09e49e5326" Apr 16 18:20:25.739600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.739577 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" Apr 16 18:20:25.739600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.739592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbszzm" event={"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerDied","Data":"a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e"} Apr 16 18:20:25.739795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.739621 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e" Apr 16 18:20:25.742043 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.742016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" event={"ID":"50585f9d-aa82-477c-a6f1-e3c0619b1eb5","Type":"ContainerDied","Data":"a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70"} Apr 16 18:20:25.742148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.742047 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a29f213ebd98b5a7bae5f4c8d403839d6200cb91da8c71ac6cb7bfc3759cbb70" Apr 16 18:20:25.742148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.742120 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c305dzl4" Apr 16 18:20:25.743630 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.743603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv" event={"ID":"8a57bf6e-dea4-47e1-83e5-8e41a92a7ede","Type":"ContainerDied","Data":"361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1"} Apr 16 18:20:25.743749 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.743630 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361f0f8ab442f8285cfb0a1e009f75d8944eb15ab394522266a110446730d9e1" Apr 16 18:20:25.743749 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:25.743669 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vg6lv"
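
The event={...} payloads in the "SyncLoop (PLEG)" entries above are plain JSON: ID is the pod UID, Type is the lifecycle transition, and Data is the container or sandbox ID that changed state. A minimal Go sketch that decodes one of the payloads quoted above (the plegEvent type here is illustrative, not kubelet's own):

package main

import (
    "encoding/json"
    "fmt"
)

// plegEvent mirrors the three fields visible in the log payloads.
type plegEvent struct {
    ID   string `json:"ID"`   // pod UID
    Type string `json:"Type"` // e.g. ContainerDied, ContainerStarted
    Data string `json:"Data"` // container or sandbox ID
}

func main() {
    // Payload copied verbatim from the ContainerDied entry above.
    raw := `{"ID":"84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8","Type":"ContainerDied","Data":"a24ab7bb47f0db4558537a0c7bbd15fb3309cd15a2b6562891553c05aaeac38e"}`

    var ev plegEvent
    if err := json.Unmarshal([]byte(raw), &ev); err != nil {
        panic(err)
    }
    fmt.Printf("pod %s: %s for %s\n", ev.ID, ev.Type, ev.Data)
}

The pod_container_deletor.go:80 line directly after each ContainerDied event shows the kubelet looking that Data ID up in its cached pod status and not finding it, consistent with the sandbox having already been torn down.
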
Apr 16 18:20:30.564931 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.564897 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j"] Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565246 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="pull" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565256 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="pull" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565266 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="extract" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565272 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="extract" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565283 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="util" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565289 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="util" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565296 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="pull" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565301 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="pull" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565308 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="extract" Apr 16 18:20:30.565310 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565313 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565321 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565326 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565333 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565338 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565345 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="pull"
containerName="pull" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565351 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="pull" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565356 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="pull" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565361 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="pull" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565366 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565371 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565381 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565386 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565394 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565399 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="util" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565445 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="50585f9d-aa82-477c-a6f1-e3c0619b1eb5" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565455 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c117a7cb-e729-40dd-8d73-c555838139f8" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565464 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a57bf6e-dea4-47e1-83e5-8e41a92a7ede" containerName="extract" Apr 16 18:20:30.565612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.565472 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="84ba3bbd-0686-4c91-a4dc-ff6e1a1a7be8" containerName="extract" Apr 16 18:20:30.569836 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.569818 2571 util.go:30] "No sandbox for pod can be found. 
Apr 16 18:20:30.572035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.572018 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:20:30.572091 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.572063 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:20:30.572722 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.572706 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-hxpvj\"" Apr 16 18:20:30.578044 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.578018 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j"] Apr 16 18:20:30.605906 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.605871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qv9\" (UniqueName: \"kubernetes.io/projected/7974d0e8-54e4-4604-9393-05718ce1254f-kube-api-access-n5qv9\") pod \"limitador-operator-controller-manager-c7fb4c8d5-6bv4j\" (UID: \"7974d0e8-54e4-4604-9393-05718ce1254f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:30.706891 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.706852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qv9\" (UniqueName: \"kubernetes.io/projected/7974d0e8-54e4-4604-9393-05718ce1254f-kube-api-access-n5qv9\") pod \"limitador-operator-controller-manager-c7fb4c8d5-6bv4j\" (UID: \"7974d0e8-54e4-4604-9393-05718ce1254f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:30.719317 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.719285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qv9\" (UniqueName: \"kubernetes.io/projected/7974d0e8-54e4-4604-9393-05718ce1254f-kube-api-access-n5qv9\") pod \"limitador-operator-controller-manager-c7fb4c8d5-6bv4j\" (UID: \"7974d0e8-54e4-4604-9393-05718ce1254f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:30.881180 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:30.881081 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:31.037253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:31.037226 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j"] Apr 16 18:20:31.038377 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:31.038347 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7974d0e8_54e4_4604_9393_05718ce1254f.slice/crio-5bb6e944184b142b10bb60b36c260b5666efb57da000c61f354d7ff6ff8a4aaf WatchSource:0}: Error finding container 5bb6e944184b142b10bb60b36c260b5666efb57da000c61f354d7ff6ff8a4aaf: Status 404 returned error can't find the container with id 5bb6e944184b142b10bb60b36c260b5666efb57da000c61f354d7ff6ff8a4aaf Apr 16 18:20:31.768216 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:31.768180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" event={"ID":"7974d0e8-54e4-4604-9393-05718ce1254f","Type":"ContainerStarted","Data":"5bb6e944184b142b10bb60b36c260b5666efb57da000c61f354d7ff6ff8a4aaf"} Apr 16 18:20:36.090655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.090615 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6"] Apr 16 18:20:36.095415 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.095397 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:36.098124 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.098102 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 18:20:36.098222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.098207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-n8fvb\"" Apr 16 18:20:36.106074 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.106050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6"] Apr 16 18:20:36.153516 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.153476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmbj\" (UniqueName: \"kubernetes.io/projected/ca47344b-f70c-4c79-b44e-89867551a23e-kube-api-access-cjmbj\") pod \"dns-operator-controller-manager-844548ff4c-4qks6\" (UID: \"ca47344b-f70c-4c79-b44e-89867551a23e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:36.254924 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.254883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmbj\" (UniqueName: \"kubernetes.io/projected/ca47344b-f70c-4c79-b44e-89867551a23e-kube-api-access-cjmbj\") pod \"dns-operator-controller-manager-844548ff4c-4qks6\" (UID: \"ca47344b-f70c-4c79-b44e-89867551a23e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:36.264796 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.264768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmbj\" (UniqueName: 
\"kubernetes.io/projected/ca47344b-f70c-4c79-b44e-89867551a23e-kube-api-access-cjmbj\") pod \"dns-operator-controller-manager-844548ff4c-4qks6\" (UID: \"ca47344b-f70c-4c79-b44e-89867551a23e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:36.405900 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.405799 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:36.532662 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.532634 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6"] Apr 16 18:20:36.534066 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:36.534041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca47344b_f70c_4c79_b44e_89867551a23e.slice/crio-babaaa9063e642e24b902834304658eb5da2110b912aa9f7496d5ccae9a2877b WatchSource:0}: Error finding container babaaa9063e642e24b902834304658eb5da2110b912aa9f7496d5ccae9a2877b: Status 404 returned error can't find the container with id babaaa9063e642e24b902834304658eb5da2110b912aa9f7496d5ccae9a2877b Apr 16 18:20:36.791423 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:36.791388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" event={"ID":"ca47344b-f70c-4c79-b44e-89867551a23e","Type":"ContainerStarted","Data":"babaaa9063e642e24b902834304658eb5da2110b912aa9f7496d5ccae9a2877b"} Apr 16 18:20:37.797600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:37.797544 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" event={"ID":"7974d0e8-54e4-4604-9393-05718ce1254f","Type":"ContainerStarted","Data":"d8de86c16938f95cd570052ddad8cb803930a1ffbdb9fe022de1099b4f863611"} Apr 16 18:20:37.798065 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:37.797706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:38.802523 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.802487 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" event={"ID":"ca47344b-f70c-4c79-b44e-89867551a23e","Type":"ContainerStarted","Data":"e683eb76b8eb7b63e09917c3d1cc367d52c6102b23e0ab35929f7b1cae7f1d5e"} Apr 16 18:20:38.802979 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.802640 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:38.810105 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.810057 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" podStartSLOduration=2.698781497 podStartE2EDuration="8.810041642s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:31.040536324 +0000 UTC m=+655.965086953" lastFinishedPulling="2026-04-16 18:20:37.151796467 +0000 UTC m=+662.076347098" observedRunningTime="2026-04-16 18:20:37.817400107 +0000 UTC m=+662.741950759" watchObservedRunningTime="2026-04-16 18:20:38.810041642 +0000 UTC m=+663.734592294" Apr 16 18:20:38.812012 ip-10-0-142-228 kubenswrapper[2571]: I0416 
Apr 16 18:20:38.816631 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.816614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:38.818836 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.818819 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5pd8p\"" Apr 16 18:20:38.830938 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.830910 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-8fwsb"] Apr 16 18:20:38.836856 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.836816 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" podStartSLOduration=1.273435693 podStartE2EDuration="2.836803662s" podCreationTimestamp="2026-04-16 18:20:36 +0000 UTC" firstStartedPulling="2026-04-16 18:20:36.536034525 +0000 UTC m=+661.460585157" lastFinishedPulling="2026-04-16 18:20:38.099402497 +0000 UTC m=+663.023953126" observedRunningTime="2026-04-16 18:20:38.834366353 +0000 UTC m=+663.758917017" watchObservedRunningTime="2026-04-16 18:20:38.836803662 +0000 UTC m=+663.761354312" Apr 16 18:20:38.880678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.880640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wm5\" (UniqueName: \"kubernetes.io/projected/fa8f1e9c-361b-4bc5-9656-85377e6587a7-kube-api-access-m6wm5\") pod \"authorino-operator-7587b89b76-8fwsb\" (UID: \"fa8f1e9c-361b-4bc5-9656-85377e6587a7\") " pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:38.981753 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.981721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wm5\" (UniqueName: \"kubernetes.io/projected/fa8f1e9c-361b-4bc5-9656-85377e6587a7-kube-api-access-m6wm5\") pod \"authorino-operator-7587b89b76-8fwsb\" (UID: \"fa8f1e9c-361b-4bc5-9656-85377e6587a7\") " pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:38.990755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:38.990722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wm5\" (UniqueName: \"kubernetes.io/projected/fa8f1e9c-361b-4bc5-9656-85377e6587a7-kube-api-access-m6wm5\") pod \"authorino-operator-7587b89b76-8fwsb\" (UID: \"fa8f1e9c-361b-4bc5-9656-85377e6587a7\") " pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:39.126834 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:39.126720 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:39.254000 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:39.253974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-8fwsb"] Apr 16 18:20:39.255698 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:20:39.255655 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8f1e9c_361b_4bc5_9656_85377e6587a7.slice/crio-d8b5293bf9cf08f146549e15d1106de1652fe5a8243c764582a4d78003da532f WatchSource:0}: Error finding container d8b5293bf9cf08f146549e15d1106de1652fe5a8243c764582a4d78003da532f: Status 404 returned error can't find the container with id d8b5293bf9cf08f146549e15d1106de1652fe5a8243c764582a4d78003da532f Apr 16 18:20:39.807251 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:39.807209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" event={"ID":"fa8f1e9c-361b-4bc5-9656-85377e6587a7","Type":"ContainerStarted","Data":"d8b5293bf9cf08f146549e15d1106de1652fe5a8243c764582a4d78003da532f"} Apr 16 18:20:41.817093 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:41.817054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" event={"ID":"fa8f1e9c-361b-4bc5-9656-85377e6587a7","Type":"ContainerStarted","Data":"9e800331883f4fc2f15c6d75ffc3ebc2a83ec162ccda90706093e1b71c6e317f"} Apr 16 18:20:41.817446 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:41.817282 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:20:41.836369 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:41.836320 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" podStartSLOduration=2.147385882 podStartE2EDuration="3.836306677s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:39.257721435 +0000 UTC m=+664.182272067" lastFinishedPulling="2026-04-16 18:20:40.946642234 +0000 UTC m=+665.871192862" observedRunningTime="2026-04-16 18:20:41.833621332 +0000 UTC m=+666.758171977" watchObservedRunningTime="2026-04-16 18:20:41.836306677 +0000 UTC m=+666.760857327" Apr 16 18:20:48.804821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:48.804787 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-6bv4j" Apr 16 18:20:49.810601 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:49.810554 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4qks6" Apr 16 18:20:52.823635 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:20:52.823602 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-8fwsb" Apr 16 18:21:25.229302 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.229227 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:25.231564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.231545 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.234031 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.234005 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 18:21:25.234139 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.234075 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4727g\"" Apr 16 18:21:25.240851 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.240824 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:25.280571 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.280533 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.280771 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.280599 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbwt\" (UniqueName: \"kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.324519 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.324483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:25.381578 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.381543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.381765 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.381611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbwt\" (UniqueName: \"kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.382227 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.382183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.389875 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.389845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbwt\" (UniqueName: \"kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt\") pod \"limitador-limitador-64c8f475fb-kgc52\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 
18:21:25.543566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.543478 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:25.676209 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.676184 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:25.680914 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:21:25.680882 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d861adf_253d_4e10_b6e5_dfc1f9b6c353.slice/crio-7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76 WatchSource:0}: Error finding container 7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76: Status 404 returned error can't find the container with id 7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76 Apr 16 18:21:25.990054 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:25.990015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" event={"ID":"0d861adf-253d-4e10-b6e5-dfc1f9b6c353","Type":"ContainerStarted","Data":"7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76"} Apr 16 18:21:30.010721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:30.010595 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" event={"ID":"0d861adf-253d-4e10-b6e5-dfc1f9b6c353","Type":"ContainerStarted","Data":"7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260"} Apr 16 18:21:30.011176 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:30.010763 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:30.028844 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:30.028793 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" podStartSLOduration=1.004658203 podStartE2EDuration="5.028777949s" podCreationTimestamp="2026-04-16 18:21:25 +0000 UTC" firstStartedPulling="2026-04-16 18:21:25.682750817 +0000 UTC m=+710.607301447" lastFinishedPulling="2026-04-16 18:21:29.70687056 +0000 UTC m=+714.631421193" observedRunningTime="2026-04-16 18:21:30.027232286 +0000 UTC m=+714.951782942" watchObservedRunningTime="2026-04-16 18:21:30.028777949 +0000 UTC m=+714.953328600" Apr 16 18:21:41.015988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:41.015955 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:41.434743 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:41.434639 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:41.434925 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:41.434872 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" containerName="limitador" containerID="cri-o://7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260" gracePeriod=30 Apr 16 18:21:42.057010 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.056925 2571 generic.go:358] "Generic (PLEG): container finished" podID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" containerID="7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260" exitCode=0
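
The manager.go:1169 warning above (like the earlier ones for the operator pods) names a cgroup whose slice encodes the pod UID with dashes replaced by underscores — compare pod0d861adf_253d_4e10_b6e5_dfc1f9b6c353 with podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" in the surrounding entries — and the container ID after the crio- prefix. An illustrative helper (hypothetical, not kubelet code) that recovers both from such a path:

package main

import (
    "fmt"
    "regexp"
    "strings"
)

var slicePattern = regexp.MustCompile(`pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)`)

// parseCgroupPath extracts the pod UID and container ID embedded in a
// kubepods cgroup path like the ones in the watch-event warnings above.
func parseCgroupPath(path string) (podUID, containerID string, ok bool) {
    m := slicePattern.FindStringSubmatch(path)
    if m == nil {
        return "", "", false
    }
    // Undo the dash-to-underscore escaping used in the slice name.
    return strings.ReplaceAll(m[1], "_", "-"), m[2], true
}

func main() {
    p := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d861adf_253d_4e10_b6e5_dfc1f9b6c353.slice/crio-7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76"
    uid, cid, _ := parseCgroupPath(p)
    fmt.Println(uid) // 0d861adf-253d-4e10-b6e5-dfc1f9b6c353
    fmt.Println(cid) // 7f63190dcdc3...
}

The 404 in the warning looks transient rather than harmful: the watch event fires before the runtime has finished registering the new container, and the very next PLEG entry reports ContainerStarted for the same 7f63190d... ID.
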
containerID="7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260" exitCode=0 Apr 16 18:21:42.057355 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.057003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" event={"ID":"0d861adf-253d-4e10-b6e5-dfc1f9b6c353","Type":"ContainerDied","Data":"7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260"} Apr 16 18:21:42.378903 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.378879 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:42.433096 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.433061 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file\") pod \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " Apr 16 18:21:42.433262 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.433160 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bbwt\" (UniqueName: \"kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt\") pod \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\" (UID: \"0d861adf-253d-4e10-b6e5-dfc1f9b6c353\") " Apr 16 18:21:42.433483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.433455 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file" (OuterVolumeSpecName: "config-file") pod "0d861adf-253d-4e10-b6e5-dfc1f9b6c353" (UID: "0d861adf-253d-4e10-b6e5-dfc1f9b6c353"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:42.435314 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.435290 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt" (OuterVolumeSpecName: "kube-api-access-5bbwt") pod "0d861adf-253d-4e10-b6e5-dfc1f9b6c353" (UID: "0d861adf-253d-4e10-b6e5-dfc1f9b6c353"). InnerVolumeSpecName "kube-api-access-5bbwt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:42.534772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.534731 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-config-file\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:21:42.534772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:42.534765 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bbwt\" (UniqueName: \"kubernetes.io/projected/0d861adf-253d-4e10-b6e5-dfc1f9b6c353-kube-api-access-5bbwt\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:21:43.062329 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.062300 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" Apr 16 18:21:43.062329 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.062320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-kgc52" event={"ID":"0d861adf-253d-4e10-b6e5-dfc1f9b6c353","Type":"ContainerDied","Data":"7f63190dcdc3fb5987fada3785b1c4e110a984538fcd42d3440b51919c7dee76"} Apr 16 18:21:43.062818 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.062367 2571 scope.go:117] "RemoveContainer" containerID="7219ccb566e47f5cd5f2bf5999887b5bebddaa5fe0799c7a9545268cc8b32260" Apr 16 18:21:43.083953 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.083928 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:43.087943 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.087921 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-kgc52"] Apr 16 18:21:43.641553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:21:43.641521 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" path="/var/lib/kubelet/pods/0d861adf-253d-4e10-b6e5-dfc1f9b6c353/volumes" Apr 16 18:22:00.534906 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.534866 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476"] Apr 16 18:22:00.535351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.535234 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" containerName="limitador" Apr 16 18:22:00.535351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.535246 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" containerName="limitador" Apr 16 18:22:00.535351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.535307 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d861adf-253d-4e10-b6e5-dfc1f9b6c353" containerName="limitador" Apr 16 18:22:00.538501 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.538484 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.551476 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.551446 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476"] Apr 16 18:22:00.695740 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.695700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.695951 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.695760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.695951 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.695811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.695951 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.695908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ed490bc6-e334-4254-8567-902724b2a88e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.696138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.695974 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.696138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.696010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxjw\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-kube-api-access-jsxjw\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.696138 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.696028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.796499 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.796415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.796499 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.796483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.796680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.796555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.797007 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.796971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ed490bc6-e334-4254-8567-902724b2a88e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.797153 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.797091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.797222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.797157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxjw\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-kube-api-access-jsxjw\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.797222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.797188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.797222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.797204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.799070 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.799042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.799409 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.799388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.799540 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.799512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ed490bc6-e334-4254-8567-902724b2a88e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.799678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.799660 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ed490bc6-e334-4254-8567-902724b2a88e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.816651 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.816623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.831872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.831849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxjw\" (UniqueName: \"kubernetes.io/projected/ed490bc6-e334-4254-8567-902724b2a88e-kube-api-access-jsxjw\") pod \"istiod-openshift-gateway-55ff986f96-l7476\" (UID: \"ed490bc6-e334-4254-8567-902724b2a88e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.849762 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.849733 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:00.987182 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.987155 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476"] Apr 16 18:22:00.988748 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:22:00.988722 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded490bc6_e334_4254_8567_902724b2a88e.slice/crio-4596edc71f5c756cbb762dfa9df496c9e2d18d6623fc4badd61a933e806ee6ee WatchSource:0}: Error finding container 4596edc71f5c756cbb762dfa9df496c9e2d18d6623fc4badd61a933e806ee6ee: Status 404 returned error can't find the container with id 4596edc71f5c756cbb762dfa9df496c9e2d18d6623fc4badd61a933e806ee6ee Apr 16 18:22:00.990873 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.990838 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:22:00.990994 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:00.990936 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:22:01.140196 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:01.140160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" event={"ID":"ed490bc6-e334-4254-8567-902724b2a88e","Type":"ContainerStarted","Data":"eafa462e070502cab95a3017ec237a6eec43725c64d041c14fb1f09d3ead6f6c"} Apr 16 18:22:01.140196 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:01.140200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" event={"ID":"ed490bc6-e334-4254-8567-902724b2a88e","Type":"ContainerStarted","Data":"4596edc71f5c756cbb762dfa9df496c9e2d18d6623fc4badd61a933e806ee6ee"} Apr 16 18:22:01.140416 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:01.140274 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:01.167938 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:01.167874 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" podStartSLOduration=1.167855208 podStartE2EDuration="1.167855208s" podCreationTimestamp="2026-04-16 18:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:01.166244875 +0000 UTC m=+746.090795527" watchObservedRunningTime="2026-04-16 18:22:01.167855208 +0000 UTC m=+746.092405860" Apr 16 18:22:02.145912 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:02.145885 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-l7476" Apr 16 18:22:02.243980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:02.243916 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:22:02.244242 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:02.244218 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" podUID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" containerName="discovery" containerID="cri-o://6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95" gracePeriod=30 Apr 16 18:22:02.994154 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:02.994127 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:22:03.120896 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.120808 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.120896 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.120884 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121078 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.120916 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121078 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.120952 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.121084 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.121134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121490 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.121200 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm8qw\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw\") pod \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\" (UID: \"d9086ce5-8be0-4692-8e7d-1e66a0200a6e\") " Apr 16 18:22:03.121490 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.121331 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:03.121626 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.121529 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-ca-configmap\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.123509 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123471 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:03.123637 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123502 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs" (OuterVolumeSpecName: "local-certs") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:22:03.123637 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123599 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:03.123768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123668 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts" (OuterVolumeSpecName: "cacerts") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:03.123768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123673 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token" (OuterVolumeSpecName: "istio-token") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:03.123863 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.123798 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw" (OuterVolumeSpecName: "kube-api-access-zm8qw") pod "d9086ce5-8be0-4692-8e7d-1e66a0200a6e" (UID: "d9086ce5-8be0-4692-8e7d-1e66a0200a6e"). InnerVolumeSpecName "kube-api-access-zm8qw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:03.149502 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.149466 2571 generic.go:358] "Generic (PLEG): container finished" podID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" containerID="6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95" exitCode=0 Apr 16 18:22:03.149954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.149555 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" Apr 16 18:22:03.149954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.149558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" event={"ID":"d9086ce5-8be0-4692-8e7d-1e66a0200a6e","Type":"ContainerDied","Data":"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95"} Apr 16 18:22:03.149954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.149660 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq" event={"ID":"d9086ce5-8be0-4692-8e7d-1e66a0200a6e","Type":"ContainerDied","Data":"913010c7bc8852eca942ccee384143bf1171701fffafeb1d9aec87039282b0c4"} Apr 16 18:22:03.149954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.149677 2571 scope.go:117] "RemoveContainer" containerID="6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95" Apr 16 18:22:03.159838 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.159816 2571 scope.go:117] "RemoveContainer" containerID="6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95" Apr 16 18:22:03.160242 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:22:03.160210 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95\": container with ID starting with 6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95 not found: ID does not exist" containerID="6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95" Apr 16 18:22:03.160337 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.160258 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95"} err="failed to get container status \"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95\": rpc error: code = NotFound desc = could not find container \"6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95\": container with ID starting with 6d0e201319f7eac3fc9d31365103e611368717e04f5aaa0b5ddb9bcd35c76c95 not found: ID does not exist" Apr 16 18:22:03.192487 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.192312 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:22:03.194798 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.194774 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qvnpq"] Apr 16 18:22:03.222452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222405 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zm8qw\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-kube-api-access-zm8qw\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.222452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222444 2571 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-local-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.222452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222460 2571 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-cacerts\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.222739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222471 2571 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-kubeconfig\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.222739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222482 2571 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-token\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.222739 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.222494 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9086ce5-8be0-4692-8e7d-1e66a0200a6e-istio-csr-dns-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.641266 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:03.641230 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" path="/var/lib/kubelet/pods/d9086ce5-8be0-4692-8e7d-1e66a0200a6e/volumes" Apr 16 18:22:10.224279 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.224246 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:10.224645 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.224613 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" containerName="discovery" Apr 16 18:22:10.224645 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.224623 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" containerName="discovery" Apr 16 18:22:10.224731 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.224700 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9086ce5-8be0-4692-8e7d-1e66a0200a6e" containerName="discovery" Apr 16 18:22:10.229088 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.229067 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.231427 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.231295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:22:10.232093 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.232073 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:22:10.232183 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.232098 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-vh6l9\"" Apr 16 18:22:10.232183 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.232105 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:22:10.238500 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.238474 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:10.266360 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.266324 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-9x7pk"] Apr 16 18:22:10.270113 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.270091 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.272248 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.272218 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wqstx\"" Apr 16 18:22:10.272369 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.272270 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:22:10.276727 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.276704 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-9x7pk"] Apr 16 18:22:10.391050 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.391016 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.391237 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.391075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-data\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.391237 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.391157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnd6b\" (UniqueName: \"kubernetes.io/projected/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-kube-api-access-vnd6b\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.391237 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.391194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcl6\" (UniqueName: 
\"kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.492358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.492265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-data\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.492358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.492315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnd6b\" (UniqueName: \"kubernetes.io/projected/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-kube-api-access-vnd6b\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.492358 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.492339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcl6\" (UniqueName: \"kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.492606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.492478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.492752 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.492731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-data\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.495051 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.495029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.502210 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.502187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcl6\" (UniqueName: \"kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6\") pod \"kserve-controller-manager-65589c6846-vrhv6\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.502820 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.502804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnd6b\" (UniqueName: \"kubernetes.io/projected/8ffccd3f-00c7-45cb-a0c4-cf40117a8e42-kube-api-access-vnd6b\") pod \"seaweedfs-86cc847c5c-9x7pk\" (UID: \"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42\") " pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.541464 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.541428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:10.581165 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.581127 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:10.692668 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.692593 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:10.746853 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:10.746826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-9x7pk"] Apr 16 18:22:10.748258 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:22:10.748231 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ffccd3f_00c7_45cb_a0c4_cf40117a8e42.slice/crio-5008946efdeeb2becb263bb9667b5291ae4af824cb202288c9256f82c3c50f77 WatchSource:0}: Error finding container 5008946efdeeb2becb263bb9667b5291ae4af824cb202288c9256f82c3c50f77: Status 404 returned error can't find the container with id 5008946efdeeb2becb263bb9667b5291ae4af824cb202288c9256f82c3c50f77 Apr 16 18:22:11.187934 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:11.187851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" event={"ID":"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3","Type":"ContainerStarted","Data":"a6efc2398d44a03644676a1864b688fe1119e97dd202b300dca3d91f84377efe"} Apr 16 18:22:11.189397 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:11.189357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-9x7pk" event={"ID":"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42","Type":"ContainerStarted","Data":"5008946efdeeb2becb263bb9667b5291ae4af824cb202288c9256f82c3c50f77"} Apr 16 18:22:15.212854 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.212804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-9x7pk" event={"ID":"8ffccd3f-00c7-45cb-a0c4-cf40117a8e42","Type":"ContainerStarted","Data":"0f9dc80456f65f4b9995fd1d65a6163e089ed9d7626602bd5f4f2e483f223fc2"} Apr 16 18:22:15.213286 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.212899 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:15.214272 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.214248 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" event={"ID":"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3","Type":"ContainerStarted","Data":"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363"} Apr 16 18:22:15.214405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.214392 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:15.244394 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.244348 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" podStartSLOduration=1.33916352 podStartE2EDuration="5.244333487s" podCreationTimestamp="2026-04-16 18:22:10 +0000 UTC" firstStartedPulling="2026-04-16 18:22:10.704924075 +0000 UTC m=+755.629474718" lastFinishedPulling="2026-04-16 18:22:14.610094053 +0000 UTC 
m=+759.534644685" observedRunningTime="2026-04-16 18:22:15.243153881 +0000 UTC m=+760.167704544" watchObservedRunningTime="2026-04-16 18:22:15.244333487 +0000 UTC m=+760.168884175" Apr 16 18:22:15.244650 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:15.244630 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-9x7pk" podStartSLOduration=1.3240498889999999 podStartE2EDuration="5.244626121s" podCreationTimestamp="2026-04-16 18:22:10 +0000 UTC" firstStartedPulling="2026-04-16 18:22:10.749615563 +0000 UTC m=+755.674166197" lastFinishedPulling="2026-04-16 18:22:14.670191795 +0000 UTC m=+759.594742429" observedRunningTime="2026-04-16 18:22:15.228777141 +0000 UTC m=+760.153327803" watchObservedRunningTime="2026-04-16 18:22:15.244626121 +0000 UTC m=+760.169176840" Apr 16 18:22:21.220573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:21.220531 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-9x7pk" Apr 16 18:22:46.224016 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:46.223986 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:47.419044 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.418962 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:47.419458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.419200 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" podUID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" containerName="manager" containerID="cri-o://86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363" gracePeriod=10 Apr 16 18:22:47.443430 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.443401 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-8q79f"] Apr 16 18:22:47.445611 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.445596 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.454066 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.454040 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-8q79f"] Apr 16 18:22:47.512712 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.512662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t522m\" (UniqueName: \"kubernetes.io/projected/7739e1d0-3989-401f-b25d-9a79eb91b7fa-kube-api-access-t522m\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.512840 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.512729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7739e1d0-3989-401f-b25d-9a79eb91b7fa-cert\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.613211 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.613182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t522m\" (UniqueName: \"kubernetes.io/projected/7739e1d0-3989-401f-b25d-9a79eb91b7fa-kube-api-access-t522m\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.613374 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.613277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7739e1d0-3989-401f-b25d-9a79eb91b7fa-cert\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.615786 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.615760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7739e1d0-3989-401f-b25d-9a79eb91b7fa-cert\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.623005 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.622385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t522m\" (UniqueName: \"kubernetes.io/projected/7739e1d0-3989-401f-b25d-9a79eb91b7fa-kube-api-access-t522m\") pod \"kserve-controller-manager-65589c6846-8q79f\" (UID: \"7739e1d0-3989-401f-b25d-9a79eb91b7fa\") " pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.683131 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.683109 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:47.804393 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.804353 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:47.815327 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.815303 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert\") pod \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " Apr 16 18:22:47.815422 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.815389 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxcl6\" (UniqueName: \"kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6\") pod \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\" (UID: \"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3\") " Apr 16 18:22:47.817473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.817444 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6" (OuterVolumeSpecName: "kube-api-access-gxcl6") pod "84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" (UID: "84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3"). InnerVolumeSpecName "kube-api-access-gxcl6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:47.817473 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.817464 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert" (OuterVolumeSpecName: "cert") pod "84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" (UID: "84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:47.916746 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.916713 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-cert\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:47.916746 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:47.916746 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxcl6\" (UniqueName: \"kubernetes.io/projected/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3-kube-api-access-gxcl6\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:22:48.136168 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.136140 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-8q79f"] Apr 16 18:22:48.138663 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:22:48.138635 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7739e1d0_3989_401f_b25d_9a79eb91b7fa.slice/crio-401c9ab45beba01dfa0933dff2a74940c1ffe9ea666382e83fa7ba9171da810d WatchSource:0}: Error finding container 401c9ab45beba01dfa0933dff2a74940c1ffe9ea666382e83fa7ba9171da810d: Status 404 returned error can't find the container with id 401c9ab45beba01dfa0933dff2a74940c1ffe9ea666382e83fa7ba9171da810d Apr 16 18:22:48.139982 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.139962 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:22:48.346434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.346395 2571 generic.go:358] "Generic (PLEG): container finished" podID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" containerID="86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363" exitCode=0 Apr 
16 18:22:48.346620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.346469 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" Apr 16 18:22:48.346620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.346481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" event={"ID":"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3","Type":"ContainerDied","Data":"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363"} Apr 16 18:22:48.346620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.346521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-vrhv6" event={"ID":"84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3","Type":"ContainerDied","Data":"a6efc2398d44a03644676a1864b688fe1119e97dd202b300dca3d91f84377efe"} Apr 16 18:22:48.346620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.346537 2571 scope.go:117] "RemoveContainer" containerID="86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363" Apr 16 18:22:48.347606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.347583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-8q79f" event={"ID":"7739e1d0-3989-401f-b25d-9a79eb91b7fa","Type":"ContainerStarted","Data":"401c9ab45beba01dfa0933dff2a74940c1ffe9ea666382e83fa7ba9171da810d"} Apr 16 18:22:48.355318 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.355303 2571 scope.go:117] "RemoveContainer" containerID="86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363" Apr 16 18:22:48.355585 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:22:48.355567 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363\": container with ID starting with 86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363 not found: ID does not exist" containerID="86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363" Apr 16 18:22:48.355626 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.355602 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363"} err="failed to get container status \"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363\": rpc error: code = NotFound desc = could not find container \"86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363\": container with ID starting with 86db0623fda4d143d40a991c9870b6d05d6c117fcaee84466dceaa92146ef363 not found: ID does not exist" Apr 16 18:22:48.368374 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.368342 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:48.372306 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:48.372283 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-vrhv6"] Apr 16 18:22:49.353422 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:49.353389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-8q79f" event={"ID":"7739e1d0-3989-401f-b25d-9a79eb91b7fa","Type":"ContainerStarted","Data":"5f0cd377dc72d773ad2b28776e9f50af8fbf78890eac4dfcdfb5e4a9c98e1206"} Apr 16 18:22:49.353870 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:49.353479 
2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:22:49.373042 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:49.372990 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-8q79f" podStartSLOduration=2.004247031 podStartE2EDuration="2.372972059s" podCreationTimestamp="2026-04-16 18:22:47 +0000 UTC" firstStartedPulling="2026-04-16 18:22:48.140120095 +0000 UTC m=+793.064670724" lastFinishedPulling="2026-04-16 18:22:48.508845112 +0000 UTC m=+793.433395752" observedRunningTime="2026-04-16 18:22:49.370970071 +0000 UTC m=+794.295520723" watchObservedRunningTime="2026-04-16 18:22:49.372972059 +0000 UTC m=+794.297522709" Apr 16 18:22:49.642079 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:22:49.641986 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" path="/var/lib/kubelet/pods/84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3/volumes" Apr 16 18:23:20.364569 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:20.364537 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-8q79f" Apr 16 18:23:21.337888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.337850 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nxzz9"] Apr 16 18:23:21.338253 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.338240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" containerName="manager" Apr 16 18:23:21.338305 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.338254 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" containerName="manager" Apr 16 18:23:21.338349 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.338312 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="84d7f6be-9fd7-448e-adaf-e9d7b0b27ce3" containerName="manager" Apr 16 18:23:21.340457 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.340436 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.342887 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.342869 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:23:21.343517 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.343500 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cn69p\"" Apr 16 18:23:21.350904 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.350876 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nxzz9"] Apr 16 18:23:21.358890 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.358864 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-zw59l"] Apr 16 18:23:21.361325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.361305 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.363601 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.363581 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mk5mh\"" Apr 16 18:23:21.363811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.363644 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:23:21.369579 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.369467 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-zw59l"] Apr 16 18:23:21.410936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.410893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9frc\" (UniqueName: \"kubernetes.io/projected/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-kube-api-access-x9frc\") pod \"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.411146 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.410954 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9xl\" (UniqueName: \"kubernetes.io/projected/16f30508-81c2-4329-9308-c29e56ea7cdb-kube-api-access-cr9xl\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.411146 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.410987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.411146 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.411008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-tls-certs\") pod \"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.512666 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.512627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9xl\" (UniqueName: \"kubernetes.io/projected/16f30508-81c2-4329-9308-c29e56ea7cdb-kube-api-access-cr9xl\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.512871 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.512680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.512871 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.512721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-tls-certs\") pod 
\"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.512871 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.512835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9frc\" (UniqueName: \"kubernetes.io/projected/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-kube-api-access-x9frc\") pod \"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.513003 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:23:21.512876 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:23:21.513003 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:23:21.512967 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert podName:16f30508-81c2-4329-9308-c29e56ea7cdb nodeName:}" failed. No retries permitted until 2026-04-16 18:23:22.012945338 +0000 UTC m=+826.937495969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert") pod "odh-model-controller-696fc77849-zw59l" (UID: "16f30508-81c2-4329-9308-c29e56ea7cdb") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:23:21.515537 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.515513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-tls-certs\") pod \"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.522128 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.522100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9frc\" (UniqueName: \"kubernetes.io/projected/196b7a49-c2eb-4e0c-8185-5fb955cdb8eb-kube-api-access-x9frc\") pod \"model-serving-api-86f7b4b499-nxzz9\" (UID: \"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb\") " pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.522259 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.522161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9xl\" (UniqueName: \"kubernetes.io/projected/16f30508-81c2-4329-9308-c29e56ea7cdb-kube-api-access-cr9xl\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:21.652046 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.651959 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:21.781970 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:21.781943 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nxzz9"] Apr 16 18:23:21.783406 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:23:21.783376 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod196b7a49_c2eb_4e0c_8185_5fb955cdb8eb.slice/crio-94974436aecb50447f717f0b1e1854416f9758302b209146a886f2c62ed37f42 WatchSource:0}: Error finding container 94974436aecb50447f717f0b1e1854416f9758302b209146a886f2c62ed37f42: Status 404 returned error can't find the container with id 94974436aecb50447f717f0b1e1854416f9758302b209146a886f2c62ed37f42 Apr 16 18:23:22.018378 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.018345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:22.020873 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.020851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16f30508-81c2-4329-9308-c29e56ea7cdb-cert\") pod \"odh-model-controller-696fc77849-zw59l\" (UID: \"16f30508-81c2-4329-9308-c29e56ea7cdb\") " pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:22.273820 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.273736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:22.431386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.431355 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-zw59l"] Apr 16 18:23:22.433290 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:23:22.433261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f30508_81c2_4329_9308_c29e56ea7cdb.slice/crio-99ff7d19d7f5df199b98c34aec6173af7b659d188d925fe4c55d3f6e702a471b WatchSource:0}: Error finding container 99ff7d19d7f5df199b98c34aec6173af7b659d188d925fe4c55d3f6e702a471b: Status 404 returned error can't find the container with id 99ff7d19d7f5df199b98c34aec6173af7b659d188d925fe4c55d3f6e702a471b Apr 16 18:23:22.492782 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.492739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-zw59l" event={"ID":"16f30508-81c2-4329-9308-c29e56ea7cdb","Type":"ContainerStarted","Data":"99ff7d19d7f5df199b98c34aec6173af7b659d188d925fe4c55d3f6e702a471b"} Apr 16 18:23:22.494063 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:22.494023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nxzz9" event={"ID":"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb","Type":"ContainerStarted","Data":"94974436aecb50447f717f0b1e1854416f9758302b209146a886f2c62ed37f42"} Apr 16 18:23:23.501130 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:23.501091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nxzz9" 
event={"ID":"196b7a49-c2eb-4e0c-8185-5fb955cdb8eb","Type":"ContainerStarted","Data":"471671eaa783b7d51bababc7b06602f242e0f8ed4a72ca516c52e656f837b1e7"} Apr 16 18:23:23.501603 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:23.501236 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:23.519617 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:23.519533 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nxzz9" podStartSLOduration=1.548104975 podStartE2EDuration="2.519515155s" podCreationTimestamp="2026-04-16 18:23:21 +0000 UTC" firstStartedPulling="2026-04-16 18:23:21.785144588 +0000 UTC m=+826.709695220" lastFinishedPulling="2026-04-16 18:23:22.75655477 +0000 UTC m=+827.681105400" observedRunningTime="2026-04-16 18:23:23.517440582 +0000 UTC m=+828.441991233" watchObservedRunningTime="2026-04-16 18:23:23.519515155 +0000 UTC m=+828.444065807" Apr 16 18:23:25.511806 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:25.511774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-zw59l" event={"ID":"16f30508-81c2-4329-9308-c29e56ea7cdb","Type":"ContainerStarted","Data":"381d661da010224608f44f287a49bd5a7d78a12d3ed53dd724a0a7747e6dda0c"} Apr 16 18:23:25.512236 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:25.511955 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:23:25.529888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:25.529832 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-zw59l" podStartSLOduration=2.2032683730000002 podStartE2EDuration="4.529815405s" podCreationTimestamp="2026-04-16 18:23:21 +0000 UTC" firstStartedPulling="2026-04-16 18:23:22.434928603 +0000 UTC m=+827.359479232" lastFinishedPulling="2026-04-16 18:23:24.761475632 +0000 UTC m=+829.686026264" observedRunningTime="2026-04-16 18:23:25.528971079 +0000 UTC m=+830.453521729" watchObservedRunningTime="2026-04-16 18:23:25.529815405 +0000 UTC m=+830.454366057" Apr 16 18:23:34.510668 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:34.510640 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nxzz9" Apr 16 18:23:36.518550 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:23:36.518520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-zw59l" Apr 16 18:24:08.197795 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.197752 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:24:08.200376 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.200360 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.203677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.203650 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:24:08.203841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.203703 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 18:24:08.203841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.203779 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:24:08.203841 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.203798 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:24:08.212600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.212577 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:24:08.231770 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.231732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.231969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.231790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.231969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.231844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.231969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.231870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lj54\" (UniqueName: \"kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.232102 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.231972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.232102 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.232008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333102 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lj54\" (UniqueName: \"kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333477 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333535 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home\") pod 
\"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.333657 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.333624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.335610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.335583 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.335858 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.335839 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.342583 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.342555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lj54\" (UniqueName: \"kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54\") pod \"scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.512078 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.511981 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:08.646535 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.646514 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:24:08.649169 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:24:08.649138 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f904ad4_a13e_43c1_a9fc_e2aca888f4ef.slice/crio-4c48459572558f850e92b34e5b0f6605eca9ffbbf08bb8dd4a97da82f0583f39 WatchSource:0}: Error finding container 4c48459572558f850e92b34e5b0f6605eca9ffbbf08bb8dd4a97da82f0583f39: Status 404 returned error can't find the container with id 4c48459572558f850e92b34e5b0f6605eca9ffbbf08bb8dd4a97da82f0583f39 Apr 16 18:24:08.689571 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:08.689535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerStarted","Data":"4c48459572558f850e92b34e5b0f6605eca9ffbbf08bb8dd4a97da82f0583f39"} Apr 16 18:24:12.708472 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:12.708431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerStarted","Data":"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e"} Apr 16 18:24:16.726838 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:16.726802 2571 generic.go:358] "Generic (PLEG): container finished" podID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerID="ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e" exitCode=0 Apr 16 18:24:16.727200 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:16.726874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerDied","Data":"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e"} Apr 16 18:24:18.738347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:18.738310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerStarted","Data":"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d"} Apr 16 18:24:18.757541 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:18.757459 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" podStartSLOduration=1.499203248 podStartE2EDuration="10.757444842s" podCreationTimestamp="2026-04-16 18:24:08 +0000 UTC" firstStartedPulling="2026-04-16 18:24:08.651323908 +0000 UTC m=+873.575874538" lastFinishedPulling="2026-04-16 18:24:17.909565498 +0000 UTC m=+882.834116132" observedRunningTime="2026-04-16 18:24:18.75718454 +0000 UTC m=+883.681735192" watchObservedRunningTime="2026-04-16 18:24:18.757444842 +0000 UTC m=+883.681995492" Apr 16 18:24:28.513035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:28.513000 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:28.513529 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:28.513172 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:28.525578 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:28.525546 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:28.788940 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:28.788857 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:24:35.546495 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:35.546468 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:24:35.547653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:24:35.547634 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:25:15.160740 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.160703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:25:15.163633 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.163610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.171317 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.171066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-g284x\"" Apr 16 18:25:15.171464 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.171363 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 18:25:15.171953 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.171929 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:25:15.242916 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.242883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.243080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.242921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.243080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.243041 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.243170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.243078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.243170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.243111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.243170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.243128 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbsn9\" (UniqueName: \"kubernetes.io/projected/62fffaef-f54f-481a-b9f3-6f831f44006b-kube-api-access-tbsn9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.343949 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.343911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.343958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344032 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbsn9\" (UniqueName: \"kubernetes.io/projected/62fffaef-f54f-481a-b9f3-6f831f44006b-kube-api-access-tbsn9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344412 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344483 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.344529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.344496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.346618 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.346590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.355794 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.355726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbsn9\" (UniqueName: \"kubernetes.io/projected/62fffaef-f54f-481a-b9f3-6f831f44006b-kube-api-access-tbsn9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.476743 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.476668 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:15.607157 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:25:15.607125 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62fffaef_f54f_481a_b9f3_6f831f44006b.slice/crio-2c38a5101becbc6919bfd25ac53c7d9aeb9244224b68048429cb0d391dca36fb WatchSource:0}: Error finding container 2c38a5101becbc6919bfd25ac53c7d9aeb9244224b68048429cb0d391dca36fb: Status 404 returned error can't find the container with id 2c38a5101becbc6919bfd25ac53c7d9aeb9244224b68048429cb0d391dca36fb Apr 16 18:25:15.611513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.611488 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:25:15.967451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.967409 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerStarted","Data":"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47"} Apr 16 18:25:15.967451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:15.967455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerStarted","Data":"2c38a5101becbc6919bfd25ac53c7d9aeb9244224b68048429cb0d391dca36fb"} Apr 16 18:25:16.973413 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:16.973375 2571 generic.go:358] "Generic (PLEG): container finished" podID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerID="7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47" exitCode=0 Apr 16 18:25:16.973846 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:16.973458 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerDied","Data":"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47"} Apr 16 18:25:18.984232 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:18.984184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" 
event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerStarted","Data":"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc"} Apr 16 18:25:20.406222 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.406189 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:25:20.407065 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.406535 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="main" containerID="cri-o://707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d" gracePeriod=30 Apr 16 18:25:20.687037 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.687009 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:25:20.804494 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804454 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.804678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804563 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.804678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804638 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lj54\" (UniqueName: \"kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.804678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804666 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.804915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804770 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.804915 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.804798 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm\") pod \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\" (UID: \"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef\") " Apr 16 18:25:20.805230 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.805168 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home" (OuterVolumeSpecName: "home") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: 
"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:25:20.805230 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.805211 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache" (OuterVolumeSpecName: "model-cache") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:25:20.807604 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.807560 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:25:20.808425 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.808277 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54" (OuterVolumeSpecName: "kube-api-access-9lj54") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "kube-api-access-9lj54". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:25:20.809261 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.809223 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm" (OuterVolumeSpecName: "dshm") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:25:20.871534 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.871486 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" (UID: "4f904ad4-a13e-43c1-a9fc-e2aca888f4ef"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:25:20.906202 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906150 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.906202 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906190 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.906202 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906206 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9lj54\" (UniqueName: \"kubernetes.io/projected/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-kube-api-access-9lj54\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.906524 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906216 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-home\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.906524 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906225 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-model-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.906524 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.906236 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef-dshm\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:25:20.998269 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.998235 2571 generic.go:358] "Generic (PLEG): container finished" podID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerID="707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d" exitCode=0 Apr 16 18:25:20.998459 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.998316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerDied","Data":"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d"} Apr 16 18:25:20.998459 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.998323 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" Apr 16 18:25:20.998459 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.998352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx" event={"ID":"4f904ad4-a13e-43c1-a9fc-e2aca888f4ef","Type":"ContainerDied","Data":"4c48459572558f850e92b34e5b0f6605eca9ffbbf08bb8dd4a97da82f0583f39"} Apr 16 18:25:20.998459 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:20.998373 2571 scope.go:117] "RemoveContainer" containerID="707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d" Apr 16 18:25:21.009523 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.009504 2571 scope.go:117] "RemoveContainer" containerID="ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e" Apr 16 18:25:21.023033 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.023013 2571 scope.go:117] "RemoveContainer" containerID="707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d" Apr 16 18:25:21.023366 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:25:21.023342 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d\": container with ID starting with 707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d not found: ID does not exist" containerID="707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d" Apr 16 18:25:21.023559 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.023374 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d"} err="failed to get container status \"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d\": rpc error: code = NotFound desc = could not find container \"707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d\": container with ID starting with 707223c9419a74fb958238a2ae9256c3013b32e886ac5d9bcb60e1e26cc3631d not found: ID does not exist" Apr 16 18:25:21.023559 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.023400 2571 scope.go:117] "RemoveContainer" containerID="ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e" Apr 16 18:25:21.023817 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:25:21.023792 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e\": container with ID starting with ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e not found: ID does not exist" containerID="ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e" Apr 16 18:25:21.023917 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.023826 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e"} err="failed to get container status \"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e\": rpc error: code = NotFound desc = could not find container \"ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e\": container with ID starting with ba1c2ecb9cddc64db204b3aba3b930910ef4a5ad2fdd5e00cc53476011e6450e not found: ID does not exist" Apr 16 18:25:21.046462 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.046434 2571 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:25:21.050796 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.050770 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-866c6b5cf-qsqbx"] Apr 16 18:25:21.643705 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:21.643662 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" path="/var/lib/kubelet/pods/4f904ad4-a13e-43c1-a9fc-e2aca888f4ef/volumes" Apr 16 18:25:49.131051 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:49.131000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerStarted","Data":"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e"} Apr 16 18:25:49.131524 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:49.131157 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:49.134034 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:49.134014 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:49.165369 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:49.165322 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" podStartSLOduration=2.465791531 podStartE2EDuration="34.165309548s" podCreationTimestamp="2026-04-16 18:25:15 +0000 UTC" firstStartedPulling="2026-04-16 18:25:16.974634932 +0000 UTC m=+941.899185562" lastFinishedPulling="2026-04-16 18:25:48.67415295 +0000 UTC m=+973.598703579" observedRunningTime="2026-04-16 18:25:49.161489956 +0000 UTC m=+974.086040607" watchObservedRunningTime="2026-04-16 18:25:49.165309548 +0000 UTC m=+974.089860199" Apr 16 18:25:55.476821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:55.476781 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:55.476821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:55.476828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:25:55.477418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:25:55.477158 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.54:8082/healthz\": dial tcp 10.132.0.54:8082: connect: connection refused" Apr 16 18:26:05.478673 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:05.478637 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:26:05.479791 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:05.479772 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:26:24.509005 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.508975 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:24.509489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.509403 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="main" Apr 16 18:26:24.509489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.509416 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="main" Apr 16 18:26:24.509489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.509425 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="storage-initializer" Apr 16 18:26:24.509489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.509431 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="storage-initializer" Apr 16 18:26:24.509489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.509490 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f904ad4-a13e-43c1-a9fc-e2aca888f4ef" containerName="main" Apr 16 18:26:24.530231 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.530194 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:24.530389 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.530364 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.533435 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.533401 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 18:26:24.689187 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.689384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg84w\" (UniqueName: \"kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.689384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.689384 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.689384 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.689560 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.689459 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791071 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.790985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791071 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg84w\" (UniqueName: \"kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791071 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: 
\"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791352 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791522 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791582 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.791650 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.791628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.793406 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.793388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.793502 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.793484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.812225 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.812192 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg84w\" (UniqueName: \"kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w\") pod \"precise-prefix-cache-test-kserve-bc6465d9f-zbng8\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.841713 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:24.841646 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:24.998568 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:26:24.998531 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd3bd6f1_31d4_482e_a0b5_266a54277afc.slice/crio-7bfa1dd1b17c878bfa5e4b80f54160d84fb684d4991dc9b14a907e3b8e52bfc7 WatchSource:0}: Error finding container 7bfa1dd1b17c878bfa5e4b80f54160d84fb684d4991dc9b14a907e3b8e52bfc7: Status 404 returned error can't find the container with id 7bfa1dd1b17c878bfa5e4b80f54160d84fb684d4991dc9b14a907e3b8e52bfc7 Apr 16 18:26:25.014750 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:25.014719 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:25.282322 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:25.282287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerStarted","Data":"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e"} Apr 16 18:26:25.282322 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:25.282324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerStarted","Data":"7bfa1dd1b17c878bfa5e4b80f54160d84fb684d4991dc9b14a907e3b8e52bfc7"} Apr 16 18:26:29.301356 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:29.301318 2571 generic.go:358] "Generic (PLEG): container finished" podID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerID="320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e" exitCode=0 Apr 16 18:26:29.301767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:29.301391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerDied","Data":"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e"} Apr 16 18:26:30.307337 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:30.307301 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerStarted","Data":"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1"} Apr 16 18:26:30.327969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:30.327921 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" podStartSLOduration=6.327906502 podStartE2EDuration="6.327906502s" podCreationTimestamp="2026-04-16 18:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:30.326139382 +0000 UTC m=+1015.250690033" watchObservedRunningTime="2026-04-16 18:26:30.327906502 +0000 UTC m=+1015.252457153" Apr 16 18:26:34.842403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:34.842359 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:34.842802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:34.842415 2571 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:34.855148 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:34.855124 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:35.337818 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:35.337789 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:57.587537 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:57.587501 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:57.588158 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:57.587812 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="main" containerID="cri-o://122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1" gracePeriod=30 Apr 16 18:26:57.838881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:57.838815 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:58.004811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004781 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg84w\" (UniqueName: \"kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.004811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004818 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.005028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004849 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.005028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004873 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.005028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004895 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.005028 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.004923 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location\") pod \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\" (UID: \"bd3bd6f1-31d4-482e-a0b5-266a54277afc\") " Apr 16 18:26:58.005241 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.005101 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home" (OuterVolumeSpecName: "home") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:58.005302 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.005208 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache" (OuterVolumeSpecName: "model-cache") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:58.005302 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.005288 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-home\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.007054 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.007021 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:58.007186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.007063 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm" (OuterVolumeSpecName: "dshm") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:58.007186 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.007127 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w" (OuterVolumeSpecName: "kube-api-access-hg84w") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "kube-api-access-hg84w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:58.070618 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.070558 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd3bd6f1-31d4-482e-a0b5-266a54277afc" (UID: "bd3bd6f1-31d4-482e-a0b5-266a54277afc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:58.106086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.106005 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.106086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.106044 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hg84w\" (UniqueName: \"kubernetes.io/projected/bd3bd6f1-31d4-482e-a0b5-266a54277afc-kube-api-access-hg84w\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.106086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.106056 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-model-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.106086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.106065 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd3bd6f1-31d4-482e-a0b5-266a54277afc-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.106086 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.106076 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd3bd6f1-31d4-482e-a0b5-266a54277afc-dshm\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:26:58.430781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.430679 2571 generic.go:358] "Generic (PLEG): container finished" podID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerID="122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1" exitCode=0 Apr 16 18:26:58.430781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.430722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerDied","Data":"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1"} Apr 16 18:26:58.430781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.430760 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" Apr 16 18:26:58.430781 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.430773 2571 scope.go:117] "RemoveContainer" containerID="122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1" Apr 16 18:26:58.431059 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.430763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8" event={"ID":"bd3bd6f1-31d4-482e-a0b5-266a54277afc","Type":"ContainerDied","Data":"7bfa1dd1b17c878bfa5e4b80f54160d84fb684d4991dc9b14a907e3b8e52bfc7"} Apr 16 18:26:58.448128 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.448108 2571 scope.go:117] "RemoveContainer" containerID="320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e" Apr 16 18:26:58.462958 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.461446 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:58.462958 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.462760 2571 scope.go:117] "RemoveContainer" containerID="122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1" Apr 16 18:26:58.463405 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:26:58.463381 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1\": container with ID starting with 122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1 not found: ID does not exist" containerID="122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1" Apr 16 18:26:58.463529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.463413 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1"} err="failed to get container status \"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1\": rpc error: code = NotFound desc = could not find container \"122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1\": container with ID starting with 122a2f06a6820f3c99c85d35a99ade57c911e377da59262bdde7e65530bf2ad1 not found: ID does not exist" Apr 16 18:26:58.463529 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.463432 2571 scope.go:117] "RemoveContainer" containerID="320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e" Apr 16 18:26:58.463754 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:26:58.463736 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e\": container with ID starting with 320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e not found: ID does not exist" containerID="320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e" Apr 16 18:26:58.463817 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.463761 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e"} err="failed to get container status \"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e\": rpc error: code = NotFound desc = could not find container \"320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e\": container with ID starting 
with 320be49cdd7f423da3329111e43d8a9cd5ebbd8847448e5795cde9f074e5bb6e not found: ID does not exist" Apr 16 18:26:58.468506 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:58.468482 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-bc6465d9f-zbng8"] Apr 16 18:26:59.641507 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:26:59.641476 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" path="/var/lib/kubelet/pods/bd3bd6f1-31d4-482e-a0b5-266a54277afc/volumes" Apr 16 18:27:56.106475 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:56.106435 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:27:56.107347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:56.106784 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="main" containerID="cri-o://9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc" gracePeriod=30 Apr 16 18:27:56.107347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:56.106869 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="tokenizer" containerID="cri-o://9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e" gracePeriod=30 Apr 16 18:27:56.663837 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:56.663802 2571 generic.go:358] "Generic (PLEG): container finished" podID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerID="9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc" exitCode=0 Apr 16 18:27:56.664055 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:56.663886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerDied","Data":"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc"} Apr 16 18:27:57.465496 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.465472 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:27:57.643127 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643052 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643127 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643112 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643140 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbsn9\" (UniqueName: \"kubernetes.io/projected/62fffaef-f54f-481a-b9f3-6f831f44006b-kube-api-access-tbsn9\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643188 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643229 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643291 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs\") pod \"62fffaef-f54f-481a-b9f3-6f831f44006b\" (UID: \"62fffaef-f54f-481a-b9f3-6f831f44006b\") " Apr 16 18:27:57.643385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643349 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:57.643654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643464 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:57.643654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643514 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:57.643654 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643565 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:27:57.643918 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.643893 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:57.645287 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.645268 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:27:57.645403 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.645386 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fffaef-f54f-481a-b9f3-6f831f44006b-kube-api-access-tbsn9" (OuterVolumeSpecName: "kube-api-access-tbsn9") pod "62fffaef-f54f-481a-b9f3-6f831f44006b" (UID: "62fffaef-f54f-481a-b9f3-6f831f44006b"). InnerVolumeSpecName "kube-api-access-tbsn9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:27:57.669342 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.669306 2571 generic.go:358] "Generic (PLEG): container finished" podID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerID="9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e" exitCode=0 Apr 16 18:27:57.669489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.669392 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" Apr 16 18:27:57.669489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.669390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerDied","Data":"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e"} Apr 16 18:27:57.669489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.669437 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b" event={"ID":"62fffaef-f54f-481a-b9f3-6f831f44006b","Type":"ContainerDied","Data":"2c38a5101becbc6919bfd25ac53c7d9aeb9244224b68048429cb0d391dca36fb"} Apr 16 18:27:57.669489 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.669453 2571 scope.go:117] "RemoveContainer" containerID="9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e" Apr 16 18:27:57.678817 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.678800 2571 scope.go:117] "RemoveContainer" containerID="9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc" Apr 16 18:27:57.686660 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.686642 2571 scope.go:117] "RemoveContainer" containerID="7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47" Apr 16 18:27:57.692341 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.692317 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:27:57.695398 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.695379 2571 scope.go:117] "RemoveContainer" containerID="9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e" Apr 16 18:27:57.695745 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:27:57.695722 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e\": container with ID starting with 9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e not found: ID does not exist" containerID="9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e" Apr 16 18:27:57.695833 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.695758 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e"} err="failed to get container status \"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e\": rpc error: code = NotFound desc = could not find container \"9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e\": container with ID starting with 9434c3eb746df54f2027181fdc2a982e04c4e14b53211fa5e2e23064b38be06e not found: ID does not exist" Apr 16 18:27:57.695833 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.695786 2571 scope.go:117] "RemoveContainer" containerID="9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc" Apr 16 18:27:57.696096 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:27:57.696070 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc\": container with ID starting with 9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc 
not found: ID does not exist" containerID="9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc" Apr 16 18:27:57.696193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.696100 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc"} err="failed to get container status \"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc\": rpc error: code = NotFound desc = could not find container \"9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc\": container with ID starting with 9bc4b2b2422463f9c83a462bda5a2e660ea0d2e037a8f9d973bbd95bc1c39fcc not found: ID does not exist" Apr 16 18:27:57.696193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.696123 2571 scope.go:117] "RemoveContainer" containerID="7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47" Apr 16 18:27:57.696363 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:27:57.696349 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47\": container with ID starting with 7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47 not found: ID does not exist" containerID="7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47" Apr 16 18:27:57.696448 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.696367 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47"} err="failed to get container status \"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47\": rpc error: code = NotFound desc = could not find container \"7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47\": container with ID starting with 7accfaaa903d6ed6e8caab85fcf25a262c6997919d67db9aa98bcf98b5f9bc47 not found: ID does not exist" Apr 16 18:27:57.696929 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.696911 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqhj7b"] Apr 16 18:27:57.744413 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.744358 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:27:57.744413 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.744392 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:27:57.744413 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.744408 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62fffaef-f54f-481a-b9f3-6f831f44006b-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:27:57.744413 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:57.744423 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62fffaef-f54f-481a-b9f3-6f831f44006b-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:27:57.744723 ip-10-0-142-228 
Apr 16 18:27:59.641543 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:27:59.641510 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" path="/var/lib/kubelet/pods/62fffaef-f54f-481a-b9f3-6f831f44006b/volumes"
Apr 16 18:28:08.361493 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361457 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361874 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="main"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361886 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="main"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361900 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="tokenizer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361905 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="tokenizer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361912 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="storage-initializer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361918 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="storage-initializer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361929 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="storage-initializer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361934 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="storage-initializer"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361944 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="main"
Apr 16 18:28:08.361959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.361949 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="main"
Apr 16 18:28:08.362267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.362012 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="tokenizer"
Apr 16 18:28:08.362267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.362022 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fffaef-f54f-481a-b9f3-6f831f44006b" containerName="main"
Apr 16 18:28:08.362267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.362030 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd3bd6f1-31d4-482e-a0b5-266a54277afc" containerName="main"
Apr 16 18:28:08.367401 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.367379 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.370466 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.370443 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:28:08.370599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.370443 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 18:28:08.370599 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.370585 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:28:08.370732 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.370460 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:28:08.370845 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.370822 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-4n7dq\""
Apr 16 18:28:08.376110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.376082 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:28:08.431862 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.431817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.432039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.431872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.432039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.431894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.432039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.431976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.432039 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.432023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjdx\" (UniqueName: \"kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.432193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.432057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532766 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjdx\" (UniqueName: \"kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.532988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.532931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.533308 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.533283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.533388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.533302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.533388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.533379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.533479 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.533385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.535992 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.535969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.542678 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.542644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjdx\" (UniqueName: \"kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx\") pod \"custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:08.679303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:08.679208 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:09.019363 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:09.019335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:28:09.020393 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:28:09.020365 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a0a7bd_1321_4622_b250_890491a2cb89.slice/crio-99263c974887f87644c23adb042cd9fba587b459ad98ae0dc2ba5a488f5adbd2 WatchSource:0}: Error finding container 99263c974887f87644c23adb042cd9fba587b459ad98ae0dc2ba5a488f5adbd2: Status 404 returned error can't find the container with id 99263c974887f87644c23adb042cd9fba587b459ad98ae0dc2ba5a488f5adbd2
Apr 16 18:28:09.022339 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:09.022322 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:28:09.717905 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:09.717869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerStarted","Data":"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"}
Apr 16 18:28:09.717905 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:09.717904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerStarted","Data":"99263c974887f87644c23adb042cd9fba587b459ad98ae0dc2ba5a488f5adbd2"}
Apr 16 18:28:10.723513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:10.723477 2571 generic.go:358] "Generic (PLEG): container finished" podID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerID="8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943" exitCode=0
Apr 16 18:28:10.723924 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:10.723560 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerDied","Data":"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"}
Apr 16 18:28:11.730540 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:11.730504 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerStarted","Data":"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"}
Apr 16 18:28:11.730936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:11.730546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerStarted","Data":"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"}
Apr 16 18:28:11.730936 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:11.730709 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:11.756020 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:11.755961 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" podStartSLOduration=3.755943319 podStartE2EDuration="3.755943319s" podCreationTimestamp="2026-04-16 18:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:11.753543019 +0000 UTC m=+1116.678093670" watchObservedRunningTime="2026-04-16 18:28:11.755943319 +0000 UTC m=+1116.680493972"
Apr 16 18:28:18.679425 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:18.679387 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:18.680050 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:18.679440 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:18.682289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:18.682267 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:18.762568 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:18.762540 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:28:49.766755 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:28:49.766726 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:29:35.579721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:29:35.579680 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log"
Apr 16 18:29:35.582534 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:29:35.582512 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log"
Apr 16 18:30:36.370123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:36.370046 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:30:36.370592 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:36.370362 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="main" containerID="cri-o://424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe" gracePeriod=30
Apr 16 18:30:36.370592 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:36.370428 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="tokenizer" containerID="cri-o://47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b" gracePeriod=30
Apr 16 18:30:37.325017 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.324981 2571 generic.go:358] "Generic (PLEG): container finished" podID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerID="424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe" exitCode=0
Apr 16 18:30:37.325202 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.325064 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerDied","Data":"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"}
Apr 16 18:30:37.629027 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.629007 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:30:37.759847 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.759817 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760052 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.759868 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760052 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760001 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760053 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760175 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760146 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760282 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760221 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdjdx\" (UniqueName: \"kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx\") pod \"f3a0a7bd-1321-4622-b250-890491a2cb89\" (UID: \"f3a0a7bd-1321-4622-b250-890491a2cb89\") "
Apr 16 18:30:37.760332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760293 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:37.760332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760310 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:37.760460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760433 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:37.760592 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760576 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:37.760680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760599 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:37.760680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760613 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:37.760849 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.760828 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:37.762290 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.762267 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx" (OuterVolumeSpecName: "kube-api-access-zdjdx") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "kube-api-access-zdjdx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:30:37.762392 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.762299 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f3a0a7bd-1321-4622-b250-890491a2cb89" (UID: "f3a0a7bd-1321-4622-b250-890491a2cb89"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:30:37.861606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.861504 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdjdx\" (UniqueName: \"kubernetes.io/projected/f3a0a7bd-1321-4622-b250-890491a2cb89-kube-api-access-zdjdx\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:37.861606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.861546 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a0a7bd-1321-4622-b250-890491a2cb89-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:37.861606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:37.861557 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3a0a7bd-1321-4622-b250-890491a2cb89-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\""
Apr 16 18:30:38.331443 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.331407 2571 generic.go:358] "Generic (PLEG): container finished" podID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerID="47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b" exitCode=0
Apr 16 18:30:38.331640 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.331458 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerDied","Data":"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"}
Apr 16 18:30:38.331640 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.331482 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"
Apr 16 18:30:38.331640 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.331491 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd" event={"ID":"f3a0a7bd-1321-4622-b250-890491a2cb89","Type":"ContainerDied","Data":"99263c974887f87644c23adb042cd9fba587b459ad98ae0dc2ba5a488f5adbd2"}
Apr 16 18:30:38.331640 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.331511 2571 scope.go:117] "RemoveContainer" containerID="47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"
Apr 16 18:30:38.341228 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.341208 2571 scope.go:117] "RemoveContainer" containerID="424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"
Apr 16 18:30:38.348800 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.348777 2571 scope.go:117] "RemoveContainer" containerID="8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"
Apr 16 18:30:38.356282 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.356267 2571 scope.go:117] "RemoveContainer" containerID="47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"
Apr 16 18:30:38.356532 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:30:38.356514 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b\": container with ID starting with 47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b not found: ID does not exist" containerID="47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"
Apr 16 18:30:38.356614 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.356544 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b"} err="failed to get container status \"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b\": rpc error: code = NotFound desc = could not find container \"47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b\": container with ID starting with 47afef79407af4da4fbb18c0c26c1353599b9d8a0e048548629b86daec7f524b not found: ID does not exist"
Apr 16 18:30:38.356614 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.356571 2571 scope.go:117] "RemoveContainer" containerID="424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"
Apr 16 18:30:38.356826 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:30:38.356811 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe\": container with ID starting with 424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe not found: ID does not exist" containerID="424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"
Apr 16 18:30:38.356894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.356834 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe"} err="failed to get container status \"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe\": rpc error: code = NotFound desc = could not find container \"424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe\": container with ID starting with 424eb3fcec69ff61a713c0f5bb5c2c4526759f3b5917c5055a6e831de86cedfe not found: ID does not exist"
Apr 16 18:30:38.356894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.356853 2571 scope.go:117] "RemoveContainer" containerID="8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"
Apr 16 18:30:38.357110 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:30:38.357092 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943\": container with ID starting with 8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943 not found: ID does not exist" containerID="8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"
Apr 16 18:30:38.357153 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.357116 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943"} err="failed to get container status \"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943\": rpc error: code = NotFound desc = could not find container \"8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943\": container with ID starting with 8e969d47168b704dbcc1f0cf4e350b950646b276937dc68fa47ba9fa7c699943 not found: ID does not exist"
Apr 16 18:30:38.362633 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.362608 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:30:38.373610 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:38.373579 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-59866987cmkjd"]
Apr 16 18:30:39.642193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:30:39.642162 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" path="/var/lib/kubelet/pods/f3a0a7bd-1321-4622-b250-890491a2cb89/volumes"
Apr 16 18:31:03.486957 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.486918 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"]
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487363 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="storage-initializer"
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487376 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="storage-initializer"
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487398 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="main"
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487404 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="main"
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487411 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="tokenizer"
Apr 16 18:31:03.487463 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487417 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="tokenizer"
Apr 16 18:31:03.487661 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487486 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="main"
Apr 16 18:31:03.487661 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.487494 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3a0a7bd-1321-4622-b250-890491a2cb89" containerName="tokenizer"
Apr 16 18:31:03.490872 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.490854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.493243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.493213 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:31:03.493243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.493218 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:31:03.493435 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.493217 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\""
Apr 16 18:31:03.493871 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.493856 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-7j2sd\""
Apr 16 18:31:03.493923 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.493857 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 18:31:03.504049 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.504019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"]
Apr 16 18:31:03.603653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.603875 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.603875 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2874\" (UniqueName: \"kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.603875 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.603875 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.604035 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.603878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704571 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704645 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2874\" (UniqueName: \"kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.704811 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.704718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.705093 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.705049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.705093 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.705048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.705179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.705107 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.705179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.705152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.707125 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.707104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.714932 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.714901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2874\" (UniqueName: \"kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874\") pod \"router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.800827 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.800732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
Apr 16 18:31:03.941738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:03.941705 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"]
Apr 16 18:31:03.943572 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:31:03.943546 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9cecef_f242_495d_be32_0942e8aca2bb.slice/crio-de5906db8d0df9a0ae7561ab796de01925eada4a23120150d2924cf0f8408a8a WatchSource:0}: Error finding container de5906db8d0df9a0ae7561ab796de01925eada4a23120150d2924cf0f8408a8a: Status 404 returned error can't find the container with id de5906db8d0df9a0ae7561ab796de01925eada4a23120150d2924cf0f8408a8a
Apr 16 18:31:04.441072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:04.441030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerStarted","Data":"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961"}
Apr 16 18:31:04.441072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:04.441072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerStarted","Data":"de5906db8d0df9a0ae7561ab796de01925eada4a23120150d2924cf0f8408a8a"}
Apr 16 18:31:05.446741 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:05.446671 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerID="ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961" exitCode=0
Apr 16 18:31:05.447124 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:05.446759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerDied","Data":"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961"}
Apr 16 18:31:06.453427 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:06.453382 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerStarted","Data":"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314"}
Apr 16 18:31:06.453427 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:06.453428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"
event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerStarted","Data":"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144"} Apr 16 18:31:06.453866 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:06.453499 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:31:06.486256 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:06.486198 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" podStartSLOduration=3.486176668 podStartE2EDuration="3.486176668s" podCreationTimestamp="2026-04-16 18:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:06.484779033 +0000 UTC m=+1291.409329684" watchObservedRunningTime="2026-04-16 18:31:06.486176668 +0000 UTC m=+1291.410727321" Apr 16 18:31:13.801558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:13.801517 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:31:13.801558 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:13.801565 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:31:13.804274 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:13.804249 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:31:14.492666 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:14.492637 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:31:35.497227 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:31:35.497199 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:33:02.451938 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:02.451893 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"] Apr 16 18:33:02.452525 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:02.452223 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="main" containerID="cri-o://3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144" gracePeriod=30 Apr 16 18:33:02.452525 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:02.452276 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="tokenizer" containerID="cri-o://b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314" gracePeriod=30 Apr 16 18:33:02.942680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:02.942591 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b9cecef-f242-495d-be32-0942e8aca2bb" 
containerID="3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144" exitCode=0 Apr 16 18:33:02.942680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:02.942631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerDied","Data":"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144"} Apr 16 18:33:03.817662 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.817639 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:33:03.947997 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.947959 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerID="b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314" exitCode=0 Apr 16 18:33:03.948191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.948002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerDied","Data":"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314"} Apr 16 18:33:03.948191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.948044 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" Apr 16 18:33:03.948191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.948050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx" event={"ID":"5b9cecef-f242-495d-be32-0942e8aca2bb","Type":"ContainerDied","Data":"de5906db8d0df9a0ae7561ab796de01925eada4a23120150d2924cf0f8408a8a"} Apr 16 18:33:03.948191 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.948068 2571 scope.go:117] "RemoveContainer" containerID="b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314" Apr 16 18:33:03.957206 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.957186 2571 scope.go:117] "RemoveContainer" containerID="3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144" Apr 16 18:33:03.965606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.965581 2571 scope.go:117] "RemoveContainer" containerID="ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961" Apr 16 18:33:03.973810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.973780 2571 scope.go:117] "RemoveContainer" containerID="b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314" Apr 16 18:33:03.974097 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:33:03.974077 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314\": container with ID starting with b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314 not found: ID does not exist" containerID="b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314" Apr 16 18:33:03.974149 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.974108 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314"} err="failed to get container status 
\"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314\": rpc error: code = NotFound desc = could not find container \"b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314\": container with ID starting with b7bd6f1a160f9062f1515175c991d6e0a4882b46ab81ccf4bdae48bea3583314 not found: ID does not exist" Apr 16 18:33:03.974149 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.974126 2571 scope.go:117] "RemoveContainer" containerID="3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144" Apr 16 18:33:03.974367 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:33:03.974351 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144\": container with ID starting with 3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144 not found: ID does not exist" containerID="3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144" Apr 16 18:33:03.974418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.974378 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144"} err="failed to get container status \"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144\": rpc error: code = NotFound desc = could not find container \"3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144\": container with ID starting with 3ea5de77a4741690e6618790649cb779f414a2d55fbce9056b9a251c10ef2144 not found: ID does not exist" Apr 16 18:33:03.974418 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.974395 2571 scope.go:117] "RemoveContainer" containerID="ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961" Apr 16 18:33:03.974597 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:33:03.974581 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961\": container with ID starting with ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961 not found: ID does not exist" containerID="ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961" Apr 16 18:33:03.974639 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.974599 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961"} err="failed to get container status \"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961\": rpc error: code = NotFound desc = could not find container \"ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961\": container with ID starting with ee6aa6b31fd47d5707362af4532bf431c0a5c16e2b62c860a6b1b76b7e7e4961 not found: ID does not exist" Apr 16 18:33:03.982535 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982507 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982546 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2874\" (UniqueName: 
\"kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982594 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982625 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982677 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982669 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache\") pod \"5b9cecef-f242-495d-be32-0942e8aca2bb\" (UID: \"5b9cecef-f242-495d-be32-0942e8aca2bb\") " Apr 16 18:33:03.982959 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.982923 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:03.983018 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.983000 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:03.983074 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.983036 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:03.983074 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.983045 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:03.983393 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.983371 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:03.984768 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.984741 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874" (OuterVolumeSpecName: "kube-api-access-d2874") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "kube-api-access-d2874". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:03.984885 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:03.984870 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5b9cecef-f242-495d-be32-0942e8aca2bb" (UID: "5b9cecef-f242-495d-be32-0942e8aca2bb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:04.084180 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.084141 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b9cecef-f242-495d-be32-0942e8aca2bb-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:04.084180 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.084175 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2874\" (UniqueName: \"kubernetes.io/projected/5b9cecef-f242-495d-be32-0942e8aca2bb-kube-api-access-d2874\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:04.084180 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.084186 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:04.084410 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.084196 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:04.084410 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.084205 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b9cecef-f242-495d-be32-0942e8aca2bb-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:33:04.272614 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.272579 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"] Apr 16 18:33:04.279346 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:04.279316 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-78cbf879f6-xm6sx"] Apr 16 18:33:05.648005 ip-10-0-142-228 
kubenswrapper[2571]: I0416 18:33:05.647965 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" path="/var/lib/kubelet/pods/5b9cecef-f242-495d-be32-0942e8aca2bb/volumes" Apr 16 18:33:14.094001 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.093970 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094352 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="storage-initializer" Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094363 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="storage-initializer" Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094372 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="main" Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094378 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="main" Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094400 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="tokenizer" Apr 16 18:33:14.094405 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094406 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="tokenizer" Apr 16 18:33:14.094600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094465 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="main" Apr 16 18:33:14.094600 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.094475 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b9cecef-f242-495d-be32-0942e8aca2bb" containerName="tokenizer" Apr 16 18:33:14.099518 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.099496 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.101901 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.101879 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:33:14.102497 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.102479 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 18:33:14.102585 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.102484 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:33:14.102585 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.102485 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-j54x9\"" Apr 16 18:33:14.102714 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.102543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:33:14.110240 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.110219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:33:14.176178 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176145 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.176351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.176351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.176351 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.176351 
ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.176829 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.176356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f688l\" (UniqueName: \"kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.277846 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.277808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.277996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.277860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.277996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.277889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.277996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.277914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.277996 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.277959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f688l\" (UniqueName: \"kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.278150 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.278007 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.278280 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.278255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.278324 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.278291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.278324 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.278310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.278390 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.278362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.280386 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.280370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.286417 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.286391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f688l\" (UniqueName: \"kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.410976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.410895 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:14.543275 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.543242 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:33:14.544350 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:33:14.544320 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef30338_437b_4307_b389_329f8a23e6ad.slice/crio-ec725f1ce08aca2d8d86bbca9c7607b9fc6de570dae182de742eb08f7bd5a11e WatchSource:0}: Error finding container ec725f1ce08aca2d8d86bbca9c7607b9fc6de570dae182de742eb08f7bd5a11e: Status 404 returned error can't find the container with id ec725f1ce08aca2d8d86bbca9c7607b9fc6de570dae182de742eb08f7bd5a11e Apr 16 18:33:14.546347 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.546329 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:33:14.998064 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.998026 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerStarted","Data":"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae"} Apr 16 18:33:14.998064 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:14.998066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerStarted","Data":"ec725f1ce08aca2d8d86bbca9c7607b9fc6de570dae182de742eb08f7bd5a11e"} Apr 16 18:33:16.003443 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:16.003409 2571 generic.go:358] "Generic (PLEG): container finished" podID="4ef30338-437b-4307-b389-329f8a23e6ad" containerID="b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae" exitCode=0 Apr 16 18:33:16.003863 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:16.003489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerDied","Data":"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae"} Apr 16 18:33:17.009738 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:17.009678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerStarted","Data":"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe"} Apr 16 18:33:17.010123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:17.009745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerStarted","Data":"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696"} Apr 16 18:33:17.010123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:17.009842 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:17.035329 ip-10-0-142-228 kubenswrapper[2571]: 
I0416 18:33:17.035272 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" podStartSLOduration=3.035251325 podStartE2EDuration="3.035251325s" podCreationTimestamp="2026-04-16 18:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:17.034331936 +0000 UTC m=+1421.958882589" watchObservedRunningTime="2026-04-16 18:33:17.035251325 +0000 UTC m=+1421.959801980" Apr 16 18:33:24.411680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:24.411636 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:24.411680 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:24.411711 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:24.414513 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:24.414490 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:25.047096 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:25.047065 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:33:46.051367 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:33:46.051337 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:34:35.619618 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:34:35.619589 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:34:35.622819 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:34:35.622795 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:37:31.626606 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.626571 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:37:31.630561 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.630537 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.633011 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.632986 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-9k8kp\"" Apr 16 18:37:31.633241 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.633076 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 18:37:31.645722 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.645697 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:37:31.694955 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.694923 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.694955 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.694956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.695164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.695000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.695164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.695052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.695164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.695121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.695164 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.695155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vs8d6\" (UniqueName: \"kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796419 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796419 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796653 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs8d6\" (UniqueName: \"kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796871 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796920 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.796966 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.797029 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.796974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.799166 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.799139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.813339 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.813307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs8d6\" (UniqueName: \"kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:31.941660 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:31.941560 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:32.084402 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:32.084369 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:37:32.086098 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:37:32.086070 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb589ac4a_9018_405e_9e0c_b7f5ad7ec215.slice/crio-540c39379ada138eec8d2a9228ed439912539caa9b6dc6e5ce6b7d9fd08f5b7e WatchSource:0}: Error finding container 540c39379ada138eec8d2a9228ed439912539caa9b6dc6e5ce6b7d9fd08f5b7e: Status 404 returned error can't find the container with id 540c39379ada138eec8d2a9228ed439912539caa9b6dc6e5ce6b7d9fd08f5b7e Apr 16 18:37:33.071573 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:33.071538 2571 generic.go:358] "Generic (PLEG): container finished" podID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerID="632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841" exitCode=0 Apr 16 18:37:33.072072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:33.071593 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerDied","Data":"632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841"} Apr 16 18:37:33.072072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:33.071615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerStarted","Data":"540c39379ada138eec8d2a9228ed439912539caa9b6dc6e5ce6b7d9fd08f5b7e"} Apr 16 18:37:34.079289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.079254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerStarted","Data":"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14"} Apr 16 18:37:34.079289 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.079295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerStarted","Data":"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30"} Apr 16 18:37:34.079803 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.079347 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:34.105252 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.105202 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" podStartSLOduration=3.105185034 podStartE2EDuration="3.105185034s" podCreationTimestamp="2026-04-16 18:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:34.101537601 +0000 UTC m=+1679.026088252" watchObservedRunningTime="2026-04-16 18:37:34.105185034 +0000 UTC 
m=+1679.029735684" Apr 16 18:37:34.250068 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.250034 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:37:34.250496 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.250462 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="main" containerID="cri-o://d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696" gracePeriod=30 Apr 16 18:37:34.250792 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:34.250758 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="tokenizer" containerID="cri-o://79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe" gracePeriod=30 Apr 16 18:37:35.047325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.047274 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.58:8082/healthz\": dial tcp 10.132.0.58:8082: connect: connection refused" Apr 16 18:37:35.086017 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.085983 2571 generic.go:358] "Generic (PLEG): container finished" podID="4ef30338-437b-4307-b389-329f8a23e6ad" containerID="d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696" exitCode=0 Apr 16 18:37:35.086426 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.086055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerDied","Data":"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696"} Apr 16 18:37:35.640170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.640135 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:37:35.732822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.732784 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.732822 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.732825 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.733089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.732859 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f688l\" (UniqueName: \"kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.733089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.732925 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.733089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733010 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.733089 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733077 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp\") pod \"4ef30338-437b-4307-b389-329f8a23e6ad\" (UID: \"4ef30338-437b-4307-b389-329f8a23e6ad\") " Apr 16 18:37:35.733283 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733105 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:35.733366 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733320 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:35.733454 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733435 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:35.733539 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733469 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:35.733539 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733482 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:35.733925 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.733890 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:35.735115 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.735091 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l" (OuterVolumeSpecName: "kube-api-access-f688l") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "kube-api-access-f688l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:37:35.735115 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.735103 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ef30338-437b-4307-b389-329f8a23e6ad" (UID: "4ef30338-437b-4307-b389-329f8a23e6ad"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:37:35.834181 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.834144 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:35.834181 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.834175 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef30338-437b-4307-b389-329f8a23e6ad-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:35.834181 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.834184 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f688l\" (UniqueName: \"kubernetes.io/projected/4ef30338-437b-4307-b389-329f8a23e6ad-kube-api-access-f688l\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:35.834414 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:35.834194 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ef30338-437b-4307-b389-329f8a23e6ad-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:37:36.094221 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.094130 2571 generic.go:358] "Generic (PLEG): container finished" podID="4ef30338-437b-4307-b389-329f8a23e6ad" containerID="79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe" exitCode=0 Apr 16 18:37:36.094221 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.094211 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" Apr 16 18:37:36.094750 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.094207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerDied","Data":"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe"} Apr 16 18:37:36.094750 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.094315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9" event={"ID":"4ef30338-437b-4307-b389-329f8a23e6ad","Type":"ContainerDied","Data":"ec725f1ce08aca2d8d86bbca9c7607b9fc6de570dae182de742eb08f7bd5a11e"} Apr 16 18:37:36.094750 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.094330 2571 scope.go:117] "RemoveContainer" containerID="79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe" Apr 16 18:37:36.106669 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.106645 2571 scope.go:117] "RemoveContainer" containerID="d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696" Apr 16 18:37:36.115905 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.115884 2571 scope.go:117] "RemoveContainer" containerID="b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae" Apr 16 18:37:36.122110 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.122063 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:37:36.125206 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.125186 2571 scope.go:117] "RemoveContainer" 
containerID="79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe" Apr 16 18:37:36.125495 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:37:36.125476 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe\": container with ID starting with 79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe not found: ID does not exist" containerID="79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe" Apr 16 18:37:36.125559 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.125504 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe"} err="failed to get container status \"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe\": rpc error: code = NotFound desc = could not find container \"79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe\": container with ID starting with 79694220ce072320d0a23e8d26ababb99482535862dad1afaae28a437caca3fe not found: ID does not exist" Apr 16 18:37:36.125559 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.125523 2571 scope.go:117] "RemoveContainer" containerID="d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696" Apr 16 18:37:36.125844 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:37:36.125824 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696\": container with ID starting with d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696 not found: ID does not exist" containerID="d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696" Apr 16 18:37:36.125897 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.125849 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696"} err="failed to get container status \"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696\": rpc error: code = NotFound desc = could not find container \"d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696\": container with ID starting with d8eee20fbb3e89bf01c2d13b747787f17b3cea970dd7f7aa61d48605f77f7696 not found: ID does not exist" Apr 16 18:37:36.125897 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.125866 2571 scope.go:117] "RemoveContainer" containerID="b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae" Apr 16 18:37:36.126081 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:37:36.126066 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae\": container with ID starting with b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae not found: ID does not exist" containerID="b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae" Apr 16 18:37:36.126123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.126085 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae"} err="failed to get container status \"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae\": rpc error: code = NotFound desc = could not 
find container \"b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae\": container with ID starting with b8df53e50a855a10668d65f8ecfc3607a0fec73755614bd1a2047e14648292ae not found: ID does not exist" Apr 16 18:37:36.126170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:36.126148 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheh82m9"] Apr 16 18:37:37.642005 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:37.641972 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" path="/var/lib/kubelet/pods/4ef30338-437b-4307-b389-329f8a23e6ad/volumes" Apr 16 18:37:41.942118 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:41.942077 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:41.942118 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:41.942125 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:41.944974 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:41.944952 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:37:42.121898 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:37:42.121853 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:38:03.126115 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:38:03.126083 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:39:35.658219 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:39:35.658190 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:39:35.660940 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:39:35.660915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:40:53.999170 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:53.999123 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:40:53.999717 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:53.999457 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="main" containerID="cri-o://bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30" gracePeriod=30 Apr 16 18:40:53.999717 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:53.999575 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="tokenizer" containerID="cri-o://a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14" 
gracePeriod=30 Apr 16 18:40:54.126888 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.126854 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm"] Apr 16 18:40:54.127385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127365 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="tokenizer" Apr 16 18:40:54.127385 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127385 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="tokenizer" Apr 16 18:40:54.127553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127416 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="main" Apr 16 18:40:54.127553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127425 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="main" Apr 16 18:40:54.127553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127444 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="storage-initializer" Apr 16 18:40:54.127553 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127452 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="storage-initializer" Apr 16 18:40:54.127772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127555 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="tokenizer" Apr 16 18:40:54.127772 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.127572 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef30338-437b-4307-b389-329f8a23e6ad" containerName="main" Apr 16 18:40:54.130724 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.130702 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.137112 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.137074 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-jvx5r\"" Apr 16 18:40:54.137335 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.137315 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 18:40:54.146115 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.146078 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm"] Apr 16 18:40:54.218968 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.218930 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.218968 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.218980 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/746d2b7f-f418-4055-bb4d-537b10871043-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219194 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219194 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219194 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219194 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdvn8\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-kube-api-access-hdvn8\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: 
\"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219450 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219450 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.219450 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.219351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/746d2b7f-f418-4055-bb4d-537b10871043-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320221 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320221 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/746d2b7f-f418-4055-bb4d-537b10871043-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320281 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/746d2b7f-f418-4055-bb4d-537b10871043-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320414 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdvn8\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-kube-api-access-hdvn8\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320847 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320847 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320991 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.320991 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.320964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/746d2b7f-f418-4055-bb4d-537b10871043-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.321250 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.321003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.323292 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.323236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/746d2b7f-f418-4055-bb4d-537b10871043-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.323486 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.323463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/746d2b7f-f418-4055-bb4d-537b10871043-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.328978 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.328954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.329114 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.329027 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdvn8\" (UniqueName: \"kubernetes.io/projected/746d2b7f-f418-4055-bb4d-537b10871043-kube-api-access-hdvn8\") pod \"router-gateway-2-openshift-default-6866b85949-k7nfm\" (UID: \"746d2b7f-f418-4055-bb4d-537b10871043\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.451167 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.451118 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:54.618940 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.618908 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm"] Apr 16 18:40:54.620750 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:40:54.620723 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746d2b7f_f418_4055_bb4d_537b10871043.slice/crio-c6faea3298b83beafae40a9db354ec0f1f5bc83b3fa2b207889af7014169c446 WatchSource:0}: Error finding container c6faea3298b83beafae40a9db354ec0f1f5bc83b3fa2b207889af7014169c446: Status 404 returned error can't find the container with id c6faea3298b83beafae40a9db354ec0f1f5bc83b3fa2b207889af7014169c446 Apr 16 18:40:54.622962 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.622943 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:40:54.623288 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.623259 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:40:54.623382 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.623323 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:40:54.623382 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.623352 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 18:40:54.905664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.905566 2571 generic.go:358] "Generic (PLEG): container finished" podID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerID="bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30" exitCode=0 Apr 16 18:40:54.905664 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.905643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerDied","Data":"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30"} Apr 16 18:40:54.907213 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.907184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" event={"ID":"746d2b7f-f418-4055-bb4d-537b10871043","Type":"ContainerStarted","Data":"7580a4d3bd93431e8ca60bc2d367c3a7a139f7d9ac0c9b54d15de517423576c1"} Apr 16 18:40:54.907325 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.907220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" event={"ID":"746d2b7f-f418-4055-bb4d-537b10871043","Type":"ContainerStarted","Data":"c6faea3298b83beafae40a9db354ec0f1f5bc83b3fa2b207889af7014169c446"} Apr 16 18:40:54.929472 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:54.929403 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" podStartSLOduration=0.929383103 podStartE2EDuration="929.383103ms" 
podCreationTimestamp="2026-04-16 18:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:54.927022637 +0000 UTC m=+1879.851573289" watchObservedRunningTime="2026-04-16 18:40:54.929383103 +0000 UTC m=+1879.853933754" Apr 16 18:40:55.451611 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.451583 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:55.456773 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.456674 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:55.473629 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.473602 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:40:55.632212 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632114 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632212 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632185 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632248 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs8d6\" (UniqueName: \"kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632284 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632452 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632354 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632463 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:40:55.632620 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632484 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp\") pod \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\" (UID: \"b589ac4a-9018-405e-9e0c-b7f5ad7ec215\") " Apr 16 18:40:55.632776 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632623 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:40:55.632776 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632717 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:40:55.632988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632946 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.632988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632968 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.632988 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.632980 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.633340 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.633312 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:40:55.634616 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.634588 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:40:55.634783 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.634764 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6" (OuterVolumeSpecName: "kube-api-access-vs8d6") pod "b589ac4a-9018-405e-9e0c-b7f5ad7ec215" (UID: "b589ac4a-9018-405e-9e0c-b7f5ad7ec215"). InnerVolumeSpecName "kube-api-access-vs8d6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:40:55.734451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.734413 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.734451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.734444 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vs8d6\" (UniqueName: \"kubernetes.io/projected/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kube-api-access-vs8d6\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.734451 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.734456 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b589ac4a-9018-405e-9e0c-b7f5ad7ec215-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:40:55.912947 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.912859 2571 generic.go:358] "Generic (PLEG): container finished" podID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerID="a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14" exitCode=0 Apr 16 18:40:55.913103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.912943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerDied","Data":"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14"} Apr 16 18:40:55.913103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.912987 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" event={"ID":"b589ac4a-9018-405e-9e0c-b7f5ad7ec215","Type":"ContainerDied","Data":"540c39379ada138eec8d2a9228ed439912539caa9b6dc6e5ce6b7d9fd08f5b7e"} Apr 16 18:40:55.913103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.913002 2571 scope.go:117] "RemoveContainer" containerID="a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14" Apr 16 18:40:55.913103 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.912952 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2" Apr 16 18:40:55.913544 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.913517 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:55.914723 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.914705 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-k7nfm" Apr 16 18:40:55.924721 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.924646 2571 scope.go:117] "RemoveContainer" containerID="bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30" Apr 16 18:40:55.934648 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.934587 2571 scope.go:117] "RemoveContainer" containerID="632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841" Apr 16 18:40:55.947469 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.947449 2571 scope.go:117] "RemoveContainer" containerID="a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14" Apr 16 18:40:55.948434 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:40:55.948297 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14\": container with ID starting with a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14 not found: ID does not exist" containerID="a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14" Apr 16 18:40:55.948434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.948336 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14"} err="failed to get container status \"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14\": rpc error: code = NotFound desc = could not find container \"a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14\": container with ID starting with a9c78d28f5ad3ccbb1aff9bb9ed0ff7a4ee6a08a4380821b09e173e9efdedc14 not found: ID does not exist" Apr 16 18:40:55.948434 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.948364 2571 scope.go:117] "RemoveContainer" containerID="bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30" Apr 16 18:40:55.948838 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:40:55.948800 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30\": container with ID starting with bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30 not found: ID does not exist" containerID="bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30" Apr 16 18:40:55.948933 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.948832 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30"} err="failed to get container status \"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30\": rpc error: code = NotFound desc = could not find container \"bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30\": container with ID starting with bd5778f107a9f9e9dfd3749e5d3a2c31d19ace9a2331fba0ba6f1bf43de0cf30 not found: ID 
does not exist" Apr 16 18:40:55.948933 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.948854 2571 scope.go:117] "RemoveContainer" containerID="632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841" Apr 16 18:40:55.949109 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:40:55.949085 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841\": container with ID starting with 632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841 not found: ID does not exist" containerID="632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841" Apr 16 18:40:55.949202 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.949114 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841"} err="failed to get container status \"632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841\": rpc error: code = NotFound desc = could not find container \"632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841\": container with ID starting with 632ddee75322dfa1e799e2e3115e0cb055a56c539edc445f8f700b3538fd9841 not found: ID does not exist" Apr 16 18:40:55.959894 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.959863 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:40:55.969228 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:55.969182 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-fc9cc25ht2"] Apr 16 18:40:57.650935 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:40:57.650887 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" path="/var/lib/kubelet/pods/b589ac4a-9018-405e-9e0c-b7f5ad7ec215/volumes" Apr 16 18:41:16.548900 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.548857 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:41:16.549477 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549458 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="main" Apr 16 18:41:16.549520 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549481 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="main" Apr 16 18:41:16.549566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549527 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="storage-initializer" Apr 16 18:41:16.549566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549538 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="storage-initializer" Apr 16 18:41:16.549566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549549 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="tokenizer" Apr 16 18:41:16.549566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549557 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" 
containerName="tokenizer" Apr 16 18:41:16.549709 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549656 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="main" Apr 16 18:41:16.549709 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.549672 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b589ac4a-9018-405e-9e0c-b7f5ad7ec215" containerName="tokenizer" Apr 16 18:41:16.554072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.554055 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.556305 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.556281 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7fwg5\"" Apr 16 18:41:16.556305 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.556300 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-nc8tk\"" Apr 16 18:41:16.556460 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.556443 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:41:16.563290 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.563265 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:41:16.626123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.626123 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-444xm\" (UniqueName: \"kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.626332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.626332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: 
\"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.626332 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.626441 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.626331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727430 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727430 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-444xm\" (UniqueName: \"kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727747 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727747 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727747 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727747 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727967 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.727967 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.727902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.728121 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.728101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.728401 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.728377 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.730648 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.730614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.736881 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.736851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-444xm\" (UniqueName: \"kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm\") pod \"router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:16.864674 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:16.864582 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:17.002116 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:17.002085 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:41:17.007230 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:41:17.007195 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2846b930_3487_427e_8f16_abad8b55e31e.slice/crio-38b8d457d009b4ad2b4efe48783ec21d43759a701e9b07299e5ef535d3561810 WatchSource:0}: Error finding container 38b8d457d009b4ad2b4efe48783ec21d43759a701e9b07299e5ef535d3561810: Status 404 returned error can't find the container with id 38b8d457d009b4ad2b4efe48783ec21d43759a701e9b07299e5ef535d3561810 Apr 16 18:41:18.009171 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:18.009132 2571 generic.go:358] "Generic (PLEG): container finished" podID="2846b930-3487-427e-8f16-abad8b55e31e" containerID="0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9" exitCode=0 Apr 16 18:41:18.009569 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:18.009219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerDied","Data":"0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9"} Apr 16 18:41:18.009569 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:18.009260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerStarted","Data":"38b8d457d009b4ad2b4efe48783ec21d43759a701e9b07299e5ef535d3561810"} Apr 16 18:41:19.015458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:19.015420 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerStarted","Data":"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95"} Apr 16 18:41:19.015458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:19.015460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerStarted","Data":"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670"} Apr 16 18:41:19.016047 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:19.015604 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:19.038766 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:19.038718 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" podStartSLOduration=3.038702216 podStartE2EDuration="3.038702216s" podCreationTimestamp="2026-04-16 18:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:19.037139958 +0000 UTC m=+1903.961690613" 
watchObservedRunningTime="2026-04-16 18:41:19.038702216 +0000 UTC m=+1903.963252858" Apr 16 18:41:26.864924 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:26.864879 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:26.864924 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:26.864925 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:26.867612 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:26.867588 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:27.052566 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:27.052527 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:41:48.057391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:41:48.057360 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:43:17.832771 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:17.832731 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:43:17.833243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:17.833024 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="main" containerID="cri-o://b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670" gracePeriod=30 Apr 16 18:43:17.833243 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:17.833052 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="tokenizer" containerID="cri-o://496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95" gracePeriod=30 Apr 16 18:43:18.056871 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:43:18.056832 2571 logging.go:55] [core] [Channel #551 SubChannel #552]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.61:9003", ServerName: "10.132.0.61:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.61:9003: connect: connection refused" Apr 16 18:43:18.525966 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:18.525929 2571 generic.go:358] "Generic (PLEG): container finished" podID="2846b930-3487-427e-8f16-abad8b55e31e" containerID="b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670" exitCode=0 Apr 16 18:43:18.526179 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:18.526004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerDied","Data":"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670"} Apr 16 18:43:19.056751 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.056717 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.61:9003\" within 1s: context deadline exceeded" Apr 16 18:43:19.200319 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.200290 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:43:19.261061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.260964 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261061 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261001 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-444xm\" (UniqueName: \"kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261074 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261110 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261151 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261211 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location\") pod \"2846b930-3487-427e-8f16-abad8b55e31e\" (UID: \"2846b930-3487-427e-8f16-abad8b55e31e\") " Apr 16 18:43:19.261303 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261222 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.261512 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261317 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.261512 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261398 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.261512 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261465 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-uds\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.261512 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261480 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-cache\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.261512 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.261489 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-tokenizer-tmp\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.262022 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.262001 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.263250 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.263229 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:19.263344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.263230 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm" (OuterVolumeSpecName: "kube-api-access-444xm") pod "2846b930-3487-427e-8f16-abad8b55e31e" (UID: "2846b930-3487-427e-8f16-abad8b55e31e"). InnerVolumeSpecName "kube-api-access-444xm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:19.362469 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.362434 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2846b930-3487-427e-8f16-abad8b55e31e-kserve-provision-location\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.362469 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.362464 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-444xm\" (UniqueName: \"kubernetes.io/projected/2846b930-3487-427e-8f16-abad8b55e31e-kube-api-access-444xm\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.362469 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.362474 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2846b930-3487-427e-8f16-abad8b55e31e-tls-certs\") on node \"ip-10-0-142-228.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.531483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.531386 2571 generic.go:358] "Generic (PLEG): container finished" podID="2846b930-3487-427e-8f16-abad8b55e31e" containerID="496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95" exitCode=0 Apr 16 18:43:19.531483 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.531474 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" Apr 16 18:43:19.531733 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.531472 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerDied","Data":"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95"} Apr 16 18:43:19.531733 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.531583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv" event={"ID":"2846b930-3487-427e-8f16-abad8b55e31e","Type":"ContainerDied","Data":"38b8d457d009b4ad2b4efe48783ec21d43759a701e9b07299e5ef535d3561810"} Apr 16 18:43:19.531733 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.531602 2571 scope.go:117] "RemoveContainer" containerID="496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95" Apr 16 18:43:19.540976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.540958 2571 scope.go:117] "RemoveContainer" containerID="b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670" Apr 16 18:43:19.549256 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.549232 2571 scope.go:117] "RemoveContainer" containerID="0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9" Apr 16 18:43:19.555814 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.555787 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:43:19.559711 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.559670 2571 scope.go:117] "RemoveContainer" containerID="496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95" Apr 16 18:43:19.560009 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:43:19.559984 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95\": container with ID starting with 496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95 not found: ID does not exist" containerID="496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95" Apr 16 18:43:19.560080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560021 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95"} err="failed to get container status \"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95\": rpc error: code = NotFound desc = could not find container \"496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95\": container with ID starting with 496db31facbc1be3d27b52b96f89c41b48003121a105a731a194ff0d4f69ae95 not found: ID does not exist" Apr 16 18:43:19.560080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560034 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-756d86dfbrg6cv"] Apr 16 18:43:19.560080 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560045 2571 scope.go:117] "RemoveContainer" containerID="b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670" Apr 16 18:43:19.560299 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:43:19.560282 2571 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670\": container with ID starting with b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670 not found: ID does not exist" containerID="b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670" Apr 16 18:43:19.560344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560306 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670"} err="failed to get container status \"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670\": rpc error: code = NotFound desc = could not find container \"b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670\": container with ID starting with b2dd3f61c01099ab022c8a47965d6194eb048e88c00f13d3e2089a487be7e670 not found: ID does not exist" Apr 16 18:43:19.560344 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560322 2571 scope.go:117] "RemoveContainer" containerID="0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9" Apr 16 18:43:19.560568 ip-10-0-142-228 kubenswrapper[2571]: E0416 18:43:19.560551 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9\": container with ID starting with 0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9 not found: ID does not exist" containerID="0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9" Apr 16 18:43:19.560609 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.560571 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9"} err="failed to get container status \"0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9\": rpc error: code = NotFound desc = could not find container \"0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9\": container with ID starting with 0b8179a405f63a77f0b61b11059e6b1c8fdf2a1ab2040bfa8849a06a0887d1d9 not found: ID does not exist" Apr 16 18:43:19.642025 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:19.641993 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2846b930-3487-427e-8f16-abad8b55e31e" path="/var/lib/kubelet/pods/2846b930-3487-427e-8f16-abad8b55e31e/volumes" Apr 16 18:43:33.525322 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:33.525285 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:34.618570 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:34.618536 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:35.692679 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:35.692649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:36.755521 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:36.755488 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:37.847508 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:37.847474 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:38.922097 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:38.922069 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:39.974786 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:39.974760 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:41.029469 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:41.029428 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:42.099564 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:42.099536 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:43.168854 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:43.168813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:44.214802 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:44.214767 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:45.274790 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:45.274761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:46.476480 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:46.476441 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:47.578116 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:47.578090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-k7nfm_746d2b7f-f418-4055-bb4d-537b10871043/istio-proxy/0.log" Apr 16 18:43:48.755580 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:48.755550 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-l7476_ed490bc6-e334-4254-8567-902724b2a88e/discovery/0.log" Apr 16 18:43:48.769174 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:48.769144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gqksz_839f2fc2-b2e3-439f-b39e-b7a645beaf48/istio-proxy/0.log" Apr 16 18:43:48.787535 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:48.787502 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-8459479994-94kcv_1c2effa7-2ad2-44ba-aad9-85b432e50f7e/router/0.log" Apr 16 18:43:49.623884 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:49.623852 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-l7476_ed490bc6-e334-4254-8567-902724b2a88e/discovery/0.log" Apr 16 18:43:49.638655 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:49.638631 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gqksz_839f2fc2-b2e3-439f-b39e-b7a645beaf48/istio-proxy/0.log" Apr 16 18:43:49.657659 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:49.657601 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8459479994-94kcv_1c2effa7-2ad2-44ba-aad9-85b432e50f7e/router/0.log" Apr 16 18:43:50.473076 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:50.473047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-8fwsb_fa8f1e9c-361b-4bc5-9656-85377e6587a7/manager/0.log" Apr 16 18:43:50.488006 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:50.487977 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-4qks6_ca47344b-f70c-4c79-b44e-89867551a23e/manager/0.log" Apr 16 18:43:50.562832 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:50.562790 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-6bv4j_7974d0e8-54e4-4604-9393-05718ce1254f/manager/0.log" Apr 16 18:43:55.923445 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:55.923419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ksk5b_bbf3171b-ab57-4ca5-93df-e38037360c5b/global-pull-secret-syncer/0.log" Apr 16 18:43:56.018843 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:56.018814 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8jkg6_d9b2f58f-9716-4752-b28b-793007f4eb48/konnectivity-agent/0.log" Apr 16 18:43:56.130366 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:43:56.130333 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-228.ec2.internal_f66de67f3831643d984f3539ec96bac5/haproxy/0.log" Apr 16 18:44:00.378632 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:00.378590 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-8fwsb_fa8f1e9c-361b-4bc5-9656-85377e6587a7/manager/0.log" Apr 16 18:44:00.407155 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:00.407122 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-4qks6_ca47344b-f70c-4c79-b44e-89867551a23e/manager/0.log" Apr 16 18:44:00.564735 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:00.564680 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-6bv4j_7974d0e8-54e4-4604-9393-05718ce1254f/manager/0.log" Apr 16 18:44:01.735509 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:01.735478 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/1.log" Apr 16 18:44:01.814603 ip-10-0-142-228 
kubenswrapper[2571]: I0416 18:44:01.814573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-g75tr_48231118-0790-422a-b4db-213ba79fda5b/cluster-monitoring-operator/0.log" Apr 16 18:44:01.976810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:01.976783 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-kwbtf_3db24e4b-40c0-4c3a-91d1-6c2cc0904f8a/monitoring-plugin/0.log" Apr 16 18:44:02.018441 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.018357 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2d7kk_2b610fe3-ab36-4043-8263-fcb26b8dbd58/node-exporter/0.log" Apr 16 18:44:02.052492 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.052452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2d7kk_2b610fe3-ab36-4043-8263-fcb26b8dbd58/kube-rbac-proxy/0.log" Apr 16 18:44:02.080134 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.080109 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2d7kk_2b610fe3-ab36-4043-8263-fcb26b8dbd58/init-textfile/0.log" Apr 16 18:44:02.301193 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.301095 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-h8cvn_4fb2e2d9-d9cf-4abc-94f7-195ef81bc483/kube-rbac-proxy-main/0.log" Apr 16 18:44:02.330633 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.330591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-h8cvn_4fb2e2d9-d9cf-4abc-94f7-195ef81bc483/kube-rbac-proxy-self/0.log" Apr 16 18:44:02.355954 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.355929 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-h8cvn_4fb2e2d9-d9cf-4abc-94f7-195ef81bc483/openshift-state-metrics/0.log" Apr 16 18:44:02.783971 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.783944 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/thanos-query/0.log" Apr 16 18:44:02.810252 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.810222 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/kube-rbac-proxy-web/0.log" Apr 16 18:44:02.836241 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.836216 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/kube-rbac-proxy/0.log" Apr 16 18:44:02.862503 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.862470 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/prom-label-proxy/0.log" Apr 16 18:44:02.886627 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.886601 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/kube-rbac-proxy-rules/0.log" Apr 16 18:44:02.911218 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:02.911187 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5c4899b649-g46dq_b942ff78-36e1-45e2-bf1f-bec607da9918/kube-rbac-proxy-metrics/0.log" Apr 16 18:44:03.927737 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:03.927710 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-wnnkf_5cbca814-7996-4973-bc09-b736c26d6348/networking-console-plugin/0.log" Apr 16 18:44:04.472370 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472336 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p"] Apr 16 18:44:04.472771 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472757 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="storage-initializer" Apr 16 18:44:04.472821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472773 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="storage-initializer" Apr 16 18:44:04.472821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472785 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="tokenizer" Apr 16 18:44:04.472821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472791 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="tokenizer" Apr 16 18:44:04.472821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472809 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="main" Apr 16 18:44:04.472821 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472814 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="main" Apr 16 18:44:04.472980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472875 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="tokenizer" Apr 16 18:44:04.472980 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.472885 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2846b930-3487-427e-8f16-abad8b55e31e" containerName="main" Apr 16 18:44:04.476027 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.476010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.478456 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.478277 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"openshift-service-ca.crt\"" Apr 16 18:44:04.478456 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.478344 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qmqrm\"/\"default-dockercfg-5hdfg\"" Apr 16 18:44:04.478456 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.478440 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qmqrm\"/\"kube-root-ca.crt\"" Apr 16 18:44:04.484082 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.484061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p"] Apr 16 18:44:04.572976 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.572941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-lib-modules\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.573162 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.573067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-sys\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.573162 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.573105 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-podres\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.573162 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.573129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-proc\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.573273 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.573154 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpdx\" (UniqueName: \"kubernetes.io/projected/5ca86521-be76-4e83-ac8a-b5315e10220f-kube-api-access-4qpdx\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673767 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-lib-modules\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " 
pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-sys\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-podres\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-proc\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-lib-modules\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.673948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpdx\" (UniqueName: \"kubernetes.io/projected/5ca86521-be76-4e83-ac8a-b5315e10220f-kube-api-access-4qpdx\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.674136 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-sys\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.674136 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-proc\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.674136 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.673997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ca86521-be76-4e83-ac8a-b5315e10220f-podres\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.682613 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.682578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4qpdx\" (UniqueName: \"kubernetes.io/projected/5ca86521-be76-4e83-ac8a-b5315e10220f-kube-api-access-4qpdx\") pod \"perf-node-gather-daemonset-tjh4p\" (UID: \"5ca86521-be76-4e83-ac8a-b5315e10220f\") " pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.786661 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.786566 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:04.888482 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.888436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cbbf69b9b-shr2f_9c23d52c-1bac-4e48-b575-90a96e26c043/console/0.log" Apr 16 18:44:04.919598 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:04.919565 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p"] Apr 16 18:44:04.922110 ip-10-0-142-228 kubenswrapper[2571]: W0416 18:44:04.922087 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5ca86521_be76_4e83_ac8a_b5315e10220f.slice/crio-5c99ae27d8a2df2aab747c0153107fac09c668bfd9ac076e5c731fad55947d2d WatchSource:0}: Error finding container 5c99ae27d8a2df2aab747c0153107fac09c668bfd9ac076e5c731fad55947d2d: Status 404 returned error can't find the container with id 5c99ae27d8a2df2aab747c0153107fac09c668bfd9ac076e5c731fad55947d2d Apr 16 18:44:05.360318 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:05.360234 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-wdmgj_ebd535ee-8abb-4929-ad09-ec940628fe6a/volume-data-source-validator/0.log" Apr 16 18:44:05.720969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:05.720929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" event={"ID":"5ca86521-be76-4e83-ac8a-b5315e10220f","Type":"ContainerStarted","Data":"41c529ffef045915c44dddd401935126a5f159cc79f84a60959e3ac626a86eba"} Apr 16 18:44:05.720969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:05.720969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" event={"ID":"5ca86521-be76-4e83-ac8a-b5315e10220f","Type":"ContainerStarted","Data":"5c99ae27d8a2df2aab747c0153107fac09c668bfd9ac076e5c731fad55947d2d"} Apr 16 18:44:05.721293 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:05.720988 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 18:44:05.737271 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:05.737215 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" podStartSLOduration=1.737200023 podStartE2EDuration="1.737200023s" podCreationTimestamp="2026-04-16 18:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:05.736286382 +0000 UTC m=+2070.660837037" watchObservedRunningTime="2026-04-16 18:44:05.737200023 +0000 UTC m=+2070.661750673" Apr 16 18:44:06.098576 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:06.098494 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5hd6g_d07250da-0b72-4bc8-9129-e53b19a95890/dns/0.log" 
Apr 16 18:44:06.121901 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:06.121869 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5hd6g_d07250da-0b72-4bc8-9129-e53b19a95890/kube-rbac-proxy/0.log" Apr 16 18:44:06.249669 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:06.249641 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7ldnx_382c7696-64ec-4dbb-9432-e6ac1f3479d8/dns-node-resolver/0.log" Apr 16 18:44:06.775560 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:06.775530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4crql_bb99ad62-9922-4bfa-94da-001321cb977d/node-ca/0.log" Apr 16 18:44:07.672948 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:07.672916 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-l7476_ed490bc6-e334-4254-8567-902724b2a88e/discovery/0.log" Apr 16 18:44:07.697375 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:07.697344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gqksz_839f2fc2-b2e3-439f-b39e-b7a645beaf48/istio-proxy/0.log" Apr 16 18:44:07.724267 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:07.724237 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8459479994-94kcv_1c2effa7-2ad2-44ba-aad9-85b432e50f7e/router/0.log" Apr 16 18:44:08.207296 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.207263 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n9vj2_991e5741-2829-429b-a2bd-759f5392a792/serve-healthcheck-canary/0.log" Apr 16 18:44:08.666072 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.665983 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-98zlj_60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3/insights-operator/0.log" Apr 16 18:44:08.666248 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.666112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-98zlj_60c658cc-6c75-4ae1-a6a8-d29a55bfd5a3/insights-operator/1.log" Apr 16 18:44:08.686852 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.686827 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jnxv6_70d13903-4740-4a09-aeb9-aec340552ebf/kube-rbac-proxy/0.log" Apr 16 18:44:08.708810 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.708790 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jnxv6_70d13903-4740-4a09-aeb9-aec340552ebf/exporter/0.log" Apr 16 18:44:08.733747 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:08.733713 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jnxv6_70d13903-4740-4a09-aeb9-aec340552ebf/extractor/0.log" Apr 16 18:44:11.402910 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:11.402880 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-mtzl8_b10ddccd-bd2f-41fe-a8f1-681e1f0030b3/openshift-lws-operator/0.log" Apr 16 18:44:11.735012 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:11.734984 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qmqrm/perf-node-gather-daemonset-tjh4p" Apr 16 
18:44:11.949279 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:11.949249 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-65589c6846-8q79f_7739e1d0-3989-401f-b25d-9a79eb91b7fa/manager/0.log" Apr 16 18:44:12.026373 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:12.026292 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nxzz9_196b7a49-c2eb-4e0c-8185-5fb955cdb8eb/server/0.log" Apr 16 18:44:12.220122 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:12.220085 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-zw59l_16f30508-81c2-4329-9308-c29e56ea7cdb/manager/0.log" Apr 16 18:44:12.271320 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:12.271287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-9x7pk_8ffccd3f-00c7-45cb-a0c4-cf40117a8e42/seaweedfs/0.log" Apr 16 18:44:17.019635 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:17.019591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-lpnp8_6f673f44-4374-41cb-8649-2d63280b1dcb/migrator/0.log" Apr 16 18:44:17.052922 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:17.052894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-lpnp8_6f673f44-4374-41cb-8649-2d63280b1dcb/graceful-termination/0.log" Apr 16 18:44:17.467083 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:17.467054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-xwqsm_7cd24549-bac0-49c2-ab16-4e779bd2e01e/kube-storage-version-migrator-operator/1.log" Apr 16 18:44:17.467828 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:17.467811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-xwqsm_7cd24549-bac0-49c2-ab16-4e779bd2e01e/kube-storage-version-migrator-operator/0.log" Apr 16 18:44:18.641970 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.641941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/kube-multus-additional-cni-plugins/0.log" Apr 16 18:44:18.668322 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.668277 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/egress-router-binary-copy/0.log" Apr 16 18:44:18.692234 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.692202 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/cni-plugins/0.log" Apr 16 18:44:18.716763 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.716735 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/bond-cni-plugin/0.log" Apr 16 18:44:18.740969 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.740938 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/routeoverride-cni/0.log" Apr 16 18:44:18.767713 ip-10-0-142-228 kubenswrapper[2571]: I0416 
18:44:18.767667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/whereabouts-cni-bincopy/0.log" Apr 16 18:44:18.792412 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:18.792384 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-54j8c_df565fbf-1e31-4d50-9c3f-fbc370ba976a/whereabouts-cni/0.log" Apr 16 18:44:19.208977 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:19.208946 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tvhn4_c627e69d-5828-401b-ad05-a17a07d351bf/kube-multus/0.log" Apr 16 18:44:19.347439 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:19.347409 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lnrzm_9d27531f-08c4-4c67-974c-31cacc77b8be/network-metrics-daemon/0.log" Apr 16 18:44:19.369890 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:19.369863 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lnrzm_9d27531f-08c4-4c67-974c-31cacc77b8be/kube-rbac-proxy/0.log" Apr 16 18:44:20.507458 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.507362 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/ovn-controller/0.log" Apr 16 18:44:20.537391 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.537336 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/ovn-acl-logging/0.log" Apr 16 18:44:20.561866 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.561833 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/kube-rbac-proxy-node/0.log" Apr 16 18:44:20.585525 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.585500 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:44:20.608152 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.608124 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/northd/0.log" Apr 16 18:44:20.631757 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.631728 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/nbdb/0.log" Apr 16 18:44:20.654510 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.654484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/sbdb/0.log" Apr 16 18:44:20.756388 ip-10-0-142-228 kubenswrapper[2571]: I0416 18:44:20.756356 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mj9tk_c8b82b47-ada7-4f38-9b8a-d7aa9bdb6be3/ovnkube-controller/0.log"