Apr 24 23:53:40.234791 ip-10-0-129-4 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:40.715753 ip-10-0-129-4 kubenswrapper[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:40.715753 ip-10-0-129-4 kubenswrapper[2559]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:40.715753 ip-10-0-129-4 kubenswrapper[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:40.715753 ip-10-0-129-4 kubenswrapper[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:40.715753 ip-10-0-129-4 kubenswrapper[2559]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
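(Editor's note: the deprecation warnings above all point at the same migration — each flag has a corresponding field in the KubeletConfiguration file passed via --config, here /etc/kubernetes/kubelet.conf. A minimal sketch of the equivalent config follows, using kubelet.config.k8s.io/v1beta1 field names; the runtime endpoint is taken from the flag dump later in this log, while the other values are purely illustrative, not from this node.)

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration has no config-file equivalent;
# the warning says to use eviction thresholds instead (values illustrative)
evictionHard:
  memory.available: 100Mi
```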
Apr 24 23:53:40.719702 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.719616 2559 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:40.721975 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721960 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:40.721975 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721975 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721979 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721982 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721985 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721988 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721991 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721994 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.721997 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722000 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722002 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722005 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722007 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722010 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722013 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722016 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722018 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722021 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722028 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722031 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722033 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:40.722037 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722036 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722039 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722041 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722044 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722047 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722050 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722052 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722055 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722059 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722063 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722066 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722068 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722071 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722074 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722077 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722079 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722082 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722084 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722087 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:40.722527 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722089 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722092 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722095 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722097 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722099 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722102 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722104 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722107 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722109 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722111 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722114 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722116 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722119 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722123 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722126 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722130 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722133 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722135 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722138 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:40.723021 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722141 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722143 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722146 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722148 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722151 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722154 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722156 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722159 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722162 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722164 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722167 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722169 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722172 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722175 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722179 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722181 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722184 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722186 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722189 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722192 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:40.723492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722195 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722198 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722200 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722203 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722206 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722208 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722211 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722595 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722601 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722604 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722607 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722609 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722612 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722614 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722617 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722620 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722623 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722625 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722628 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:40.723976 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722631 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722634 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722636 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722639 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722641 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722644 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722647 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722650 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722654 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722657 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722660 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722664 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722667 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722670 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722674 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722677 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722680 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722683 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722686 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:40.724451 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722688 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722691 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722694 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722697 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722700 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722702 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722705 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722707 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722710 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722712 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722715 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722718 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722720 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722723 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722726 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722730 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722732 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722735 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722738 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722740 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:40.724944 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722743 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722745 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722748 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722750 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722753 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722755 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722758 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722760 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722763 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722765 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722768 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722772 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722774 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722777 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722780 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722783 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722785 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722788 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722790 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722793 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:40.725492 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722795 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722798 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722800 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722803 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722805 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722807 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722810 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722813 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722815 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722818 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722821 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722823 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722826 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722828 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.722831 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723533 2559 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723542 2559 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723549 2559 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723554 2559 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723558 2559 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723562 2559 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:40.726010 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723566 2559 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723571 2559 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723574 2559 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723577 2559 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723580 2559 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723584 2559 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723587 2559 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723590 2559 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723593 2559 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723596 2559 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723599 2559 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723602 2559 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723605 2559 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723609 2559 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723613 2559 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723616 2559 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723619 2559 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723622 2559 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723626 2559 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723630 2559 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723633 2559 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723636 2559 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723639 2559 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723642 2559 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:40.726532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723645 2559 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723648 2559 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723651 2559 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723655 2559 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723658 2559 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723661 2559 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723664 2559 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723667 2559 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723670 2559 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723675 2559 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723678 2559 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723681 2559 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723685 2559 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723688 2559 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723692 2559 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723694 2559 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723697 2559 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723700 2559 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723703 2559 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723706 2559 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723709 2559 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723712 2559 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723716 2559 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723719 2559 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723722 2559 flags.go:64] FLAG: --feature-gates=""
Apr 24 23:53:40.727130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723728 2559 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723731 2559 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 23:53:40.727829
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723734 2559 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723738 2559 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723742 2559 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723745 2559 flags.go:64] FLAG: --help="false" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723748 2559 flags.go:64] FLAG: --hostname-override="ip-10-0-129-4.ec2.internal" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723752 2559 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723755 2559 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723758 2559 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723761 2559 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723764 2559 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723767 2559 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723771 2559 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723773 2559 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723776 2559 flags.go:64] FLAG: --kube-api-burst="100" Apr 
24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723779 2559 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723782 2559 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723785 2559 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723788 2559 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723791 2559 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723794 2559 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723797 2559 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723800 2559 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:40.727829 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723803 2559 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723806 2559 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723809 2559 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723814 2559 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723817 2559 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723820 2559 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723823 2559 flags.go:64] FLAG: 
--logging-format="text" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723826 2559 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723829 2559 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723832 2559 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723835 2559 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723840 2559 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723843 2559 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723848 2559 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723851 2559 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723854 2559 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723857 2559 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723864 2559 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723867 2559 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723870 2559 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723873 2559 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:40.728402 
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723880 2559 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723883 2559 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723886 2559 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:40.728402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723890 2559 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723893 2559 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723898 2559 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723901 2559 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723904 2559 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723907 2559 flags.go:64] FLAG: --port="10250" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723910 2559 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723913 2559 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07117da41609c8654" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723916 2559 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723919 2559 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723922 2559 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723925 
2559 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723928 2559 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723932 2559 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723935 2559 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723938 2559 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723941 2559 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723944 2559 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723947 2559 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723950 2559 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723953 2559 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723957 2559 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723960 2559 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723963 2559 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723966 2559 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:40.729007 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723969 2559 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723973 2559 flags.go:64] 
FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723976 2559 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723979 2559 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723983 2559 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723986 2559 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723989 2559 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723992 2559 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723995 2559 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.723998 2559 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724001 2559 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724003 2559 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724009 2559 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724012 2559 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724014 2559 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724018 2559 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: 
I0424 23:53:40.724021 2559 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724024 2559 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724027 2559 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724030 2559 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724035 2559 flags.go:64] FLAG: --v="2" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724039 2559 flags.go:64] FLAG: --version="false" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724043 2559 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724047 2559 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.724050 2559 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:40.729625 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724159 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724163 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724166 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724169 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724173 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 
23:53:40.724176 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724178 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724181 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724185 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724188 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724191 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724193 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724196 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724198 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724201 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724204 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724208 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724212 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724215 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:40.730262 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724219 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724223 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724226 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724229 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724232 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724235 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724238 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724240 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724244 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724247 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724249 2559 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724252 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724255 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724257 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724260 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724262 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724265 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724268 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724271 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:40.730766 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724274 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724276 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724280 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724283 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 
23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724285 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724288 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724290 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724293 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724296 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724298 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724301 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724304 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724306 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724309 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724311 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724314 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724316 2559 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724319 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724321 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724324 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:40.731261 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724326 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724330 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724333 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724336 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724338 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724341 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724343 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724346 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724348 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724351 2559 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724353 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724356 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724359 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724361 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724365 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724367 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724370 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724372 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724375 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724377 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:40.731770 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724380 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724382 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724385 2559 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724387 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724390 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724393 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724396 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.724398 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.725025 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.731557 2559 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.731570 2559 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731616 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731621 2559 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiNetworks Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731624 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731628 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731631 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:40.732280 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731634 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731637 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731640 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731643 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731646 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731648 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731651 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731654 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731657 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731660 2559 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731663 2559 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731666 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731668 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731671 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731673 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731676 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731678 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731681 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731683 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731686 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:40.732708 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731688 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731691 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 
23:53:40.731694 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731696 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731698 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731701 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731704 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731706 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731709 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731711 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731714 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731716 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731719 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731721 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731724 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:40.733201 
ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731726 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731729 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731731 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731734 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731736 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:40.733201 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731739 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731743 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731746 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731748 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731751 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731753 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731755 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731758 2559 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731761 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731763 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731766 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731768 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731771 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731774 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731777 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731780 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731782 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731785 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731787 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731790 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:40.733710 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731792 2559 feature_gate.go:328] unrecognized 
feature gate: GatewayAPIController Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731795 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731797 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731800 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731802 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731805 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731807 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731809 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731813 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731818 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731824 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731827 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731830 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731847 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731851 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731855 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731858 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731861 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731864 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:40.734203 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731867 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731870 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.731874 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731970 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731975 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731979 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731981 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731984 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731987 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731990 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731993 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731996 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.731998 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 
23:53:40.732001 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732003 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:40.734754 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732006 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732008 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732011 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732013 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732016 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732018 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732021 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732023 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732026 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732029 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732032 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:40.735118 
ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732034 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732037 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732043 2559 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732046 2559 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732048 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732051 2559 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732054 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732056 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732059 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:40.735118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732061 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732064 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732066 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732069 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:40.735628 
ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732071 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732074 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732076 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732079 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732081 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732083 2559 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732086 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732089 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732091 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732094 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732096 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732099 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732101 2559 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732103 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732106 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732108 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:40.735628 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732111 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732113 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732116 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732118 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732121 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732123 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732127 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732129 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732132 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732134 2559 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732136 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732139 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732142 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732144 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732146 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732149 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732151 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732154 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732156 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732159 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:40.736111 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732161 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732164 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: 
W0424 23:53:40.732166 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732170 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732173 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732176 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732180 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732182 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732185 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732188 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732190 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732192 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732195 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:40.732198 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.732203 2559 feature_gate.go:384] 
feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:40.736643 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.733038 2559 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 23:53:40.737035 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.736547 2559 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 23:53:40.737732 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.737721 2559 server.go:1019] "Starting client certificate rotation" Apr 24 23:53:40.737853 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.737828 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:40.739088 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.739076 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:40.767501 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.767479 2559 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:40.771375 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.771350 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:40.781804 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.781781 2559 log.go:25] "Validated CRI v1 runtime API" Apr 24 23:53:40.789637 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.789621 2559 
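The `feature_gate.go:384` entries above dump the effective gate set in Go's `map[Name:bool ...]` syntax. A minimal sketch of pulling that map out of a journal line (Python; the helper name and the trimmed sample line are illustrative, not from kubelet itself):

```python
import re


def parse_feature_gates(log_line: str) -> dict[str, bool]:
    """Extract the Go-style feature-gate map from a kubelet log line.

    Expects the form emitted at feature_gate.go:384, e.g.
    'feature gates: {map[KMSv1:true NodeSwap:false ...]}'.
    Returns an empty dict if the line doesn't match.
    """
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", log_line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates


# Shortened sample in the same shape as the log lines above.
line = ("I0424 23:53:40.732203 2559 feature_gate.go:384] "
        "feature gates: {map[KMSv1:true NodeSwap:false ImageVolume:true]}")
print(parse_feature_gates(line))
```

The lazy `(.*?)` stops at the first `]}`, which is safe here because the map body itself contains no closing braces.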
log.go:25] "Validated CRI v1 image API" Apr 24 23:53:40.792370 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.792352 2559 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 23:53:40.795200 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.795184 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:40.796520 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.796501 2559 fs.go:135] Filesystem UUIDs: map[00196c3b-3021-49b7-b91e-57cb548ba186:/dev/nvme0n1p3 1d5cef22-b42e-42ce-9fa1-c62ac0cae9f1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 24 23:53:40.796571 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.796520 2559 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 23:53:40.803279 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.803173 2559 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:40.800504444 +0000 UTC m=+0.442419309 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2995477 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d1787356344b2fff445b40d04c6ff SystemUUID:ec2d1787-3563-44b2-fff4-45b40d04c6ff BootID:43949d39-3bdf-41fc-b19e-f1f5ac89bf81 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs 
Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d7:d5:e6:22:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d7:d5:e6:22:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:5f:0e:92:05:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 23:53:40.803279 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.803274 2559 manager_no_libpfm.go:29] 
cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 23:53:40.803392 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.803354 2559 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 23:53:40.804510 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.804484 2559 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:53:40.804645 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.804513 2559 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-4.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved"
:{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:40.804689 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.804654 2559 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:40.804689 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.804662 2559 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:40.804689 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.804675 2559 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:40.806537 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.806526 2559 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:40.808195 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.808185 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:40.808472 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.808451 2559 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:40.812691 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.812681 2559 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:40.812725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.812695 2559 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:40.812725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.812710 2559 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:40.812725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.812718 2559 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:40.812808 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:53:40.812727 2559 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:40.813976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.813965 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:40.814024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.813983 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:40.817004 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.816987 2559 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:40.818595 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.818571 2559 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:40.820760 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820743 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820766 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820773 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820779 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820785 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820792 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820799 2559 plugins.go:616] "Loaded 
volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820805 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820815 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820822 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820830 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:40.820869 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.820840 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:40.822697 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.821709 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:40.822772 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.822709 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:40.826336 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826319 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:40.826426 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826355 2559 server.go:1295] "Started kubelet" Apr 24 23:53:40.826482 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826444 2559 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:40.826568 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826509 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:40.826604 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826587 2559 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:40.826921 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.826902 2559 
csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-4.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:40.827013 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.826953 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-4.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:40.827013 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.827004 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:40.827069 ip-10-0-129-4 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:40.827652 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.827599 2559 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:40.827736 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.827721 2559 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:40.833625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.833600 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:40.834136 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.833142 2559 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-4.ec2.internal.18a970225aa4fa48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-4.ec2.internal,UID:ip-10-0-129-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-4.ec2.internal,},FirstTimestamp:2026-04-24 23:53:40.82633172 +0000 UTC m=+0.468246584,LastTimestamp:2026-04-24 23:53:40.82633172 +0000 UTC m=+0.468246584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-4.ec2.internal,}" Apr 24 23:53:40.834223 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.834213 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:40.834973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.834956 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:40.834973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.834961 2559 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:40.835100 
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.834979 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:40.835179 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.835164 2559 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:40.835231 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.835182 2559 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:40.835856 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.835837 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:40.838142 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838124 2559 factory.go:153] Registering CRI-O factory Apr 24 23:53:40.838236 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838194 2559 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:40.838296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838248 2559 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:40.838296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838258 2559 factory.go:55] Registering systemd factory Apr 24 23:53:40.838296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838266 2559 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:40.838296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838293 2559 factory.go:103] Registering Raw factory Apr 24 23:53:40.838515 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838305 2559 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:40.838759 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.838745 2559 manager.go:319] Starting recovery of all containers Apr 24 23:53:40.841715 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.841693 2559 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9cvdd" Apr 24 23:53:40.842141 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.841970 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-4.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 23:53:40.842141 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.842077 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 23:53:40.849783 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.849616 2559 manager.go:324] Recovery completed Apr 24 23:53:40.850127 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.849791 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9cvdd" Apr 24 23:53:40.852145 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.852117 2559 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 23:53:40.855168 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.855152 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:40.857507 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.857490 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:40.857581 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:53:40.857520 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:40.857581 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.857534 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:40.857957 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.857942 2559 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:40.858000 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.857957 2559 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:40.858000 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.857975 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:40.859348 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.859279 2559 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-4.ec2.internal.18a970225c80a328 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-4.ec2.internal,UID:ip-10-0-129-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-4.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-4.ec2.internal,},FirstTimestamp:2026-04-24 23:53:40.857504552 +0000 UTC m=+0.499419416,LastTimestamp:2026-04-24 23:53:40.857504552 +0000 UTC m=+0.499419416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-4.ec2.internal,}" Apr 24 23:53:40.860566 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.860553 2559 policy_none.go:49] "None policy: Start" Apr 24 
23:53:40.860622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.860569 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:40.860622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.860579 2559 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:40.904029 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904008 2559 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.904055 2559 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904070 2559 server.go:85] "Starting device plugin registration server" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904320 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904330 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904423 2559 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904509 2559 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.904518 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.905014 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 23:53:40.910499 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.905045 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:40.930277 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.930249 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:40.931335 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.931317 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:40.931413 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.931341 2559 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:40.931413 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.931358 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:53:40.931413 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.931365 2559 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:40.931413 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:40.931407 2559 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:40.937431 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:40.937416 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:41.004757 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.004691 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:41.005892 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.005873 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:41.005996 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.005906 2559 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:41.005996 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.005916 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:41.005996 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.005946 2559 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.014842 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.014820 2559 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.014842 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.014843 2559 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-4.ec2.internal\": node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.031718 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.031696 2559 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal"] Apr 24 23:53:41.031819 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.031772 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:41.031819 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.031790 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.032492 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.032478 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:41.032555 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.032503 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 24 23:53:41.032555 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.032513 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:41.033777 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.033766 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:41.033923 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.033909 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.033963 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.033936 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:41.034503 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034488 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:41.034503 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034494 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:41.034658 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034515 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:41.034658 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034516 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:41.034658 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034538 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:41.034658 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.034526 2559 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:41.035704 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.035688 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.035783 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.035716 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:41.036290 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.036277 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:41.036377 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.036302 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:41.036377 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.036314 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:41.062060 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.062038 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-4.ec2.internal\" not found" node="ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.066472 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.066438 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-4.ec2.internal\" not found" node="ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.132235 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.132210 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.136577 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.136558 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.136639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.136586 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.136639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.136603 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b0e49161c603a5579b9b31b2ffe9b2e8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-4.ec2.internal\" (UID: \"b0e49161c603a5579b9b31b2ffe9b2e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.232698 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.232665 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.237957 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.237932 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b0e49161c603a5579b9b31b2ffe9b2e8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-4.ec2.internal\" (UID: \"b0e49161c603a5579b9b31b2ffe9b2e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.238058 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:53:41.237972 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.238058 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.237997 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.238058 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.238025 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b0e49161c603a5579b9b31b2ffe9b2e8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-4.ec2.internal\" (UID: \"b0e49161c603a5579b9b31b2ffe9b2e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.238058 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.238033 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.238058 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.238037 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/30bae4ccdc5c5470ae5b607b233b33aa-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal\" (UID: \"30bae4ccdc5c5470ae5b607b233b33aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.333509 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.333400 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.363904 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.363873 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.369709 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.369690 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.433734 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.433698 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.534234 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.534203 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.634669 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.634605 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.735058 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.735024 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.738231 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.738212 2559 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 23:53:41.738381 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.738362 2559 
reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 23:53:41.834363 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.834335 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:41.835172 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:41.835155 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-4.ec2.internal\" not found" Apr 24 23:53:41.849569 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.849550 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:41.853296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.853270 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:40 +0000 UTC" deadline="2027-10-01 00:14:59.33289009 +0000 UTC" Apr 24 23:53:41.853296 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.853293 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12576h21m17.479599151s" Apr 24 23:53:41.864655 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.864639 2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:41.875540 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:41.875510 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30bae4ccdc5c5470ae5b607b233b33aa.slice/crio-b425ef76fad838c5367d0ea30cd3c39daa61e12ab7acfd32a85b40ce7ded9a3a WatchSource:0}: Error 
finding container b425ef76fad838c5367d0ea30cd3c39daa61e12ab7acfd32a85b40ce7ded9a3a: Status 404 returned error can't find the container with id b425ef76fad838c5367d0ea30cd3c39daa61e12ab7acfd32a85b40ce7ded9a3a Apr 24 23:53:41.875753 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:41.875734 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e49161c603a5579b9b31b2ffe9b2e8.slice/crio-f62d032eb5fe640ec5313ef1c39c393ac2eba0a8dcb207d2c3ee2c6d49ff2ce0 WatchSource:0}: Error finding container f62d032eb5fe640ec5313ef1c39c393ac2eba0a8dcb207d2c3ee2c6d49ff2ce0: Status 404 returned error can't find the container with id f62d032eb5fe640ec5313ef1c39c393ac2eba0a8dcb207d2c3ee2c6d49ff2ce0 Apr 24 23:53:41.879434 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.879413 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:53:41.915579 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.915563 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6llz6" Apr 24 23:53:41.926724 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.926707 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6llz6" Apr 24 23:53:41.934429 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.934411 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" Apr 24 23:53:41.934697 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.934662 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" event={"ID":"30bae4ccdc5c5470ae5b607b233b33aa","Type":"ContainerStarted","Data":"b425ef76fad838c5367d0ea30cd3c39daa61e12ab7acfd32a85b40ce7ded9a3a"} Apr 24 23:53:41.935658 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:53:41.935635 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" event={"ID":"b0e49161c603a5579b9b31b2ffe9b2e8","Type":"ContainerStarted","Data":"f62d032eb5fe640ec5313ef1c39c393ac2eba0a8dcb207d2c3ee2c6d49ff2ce0"} Apr 24 23:53:41.940657 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:41.940643 2559 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:42.029091 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.029068 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:42.030074 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.030059 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" Apr 24 23:53:42.042589 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.042566 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:42.163031 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.163005 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:42.673381 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.673349 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:42.813839 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.813805 2559 apiserver.go:52] "Watching apiserver" Apr 24 23:53:42.826980 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.826955 2559 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 23:53:42.829464 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:53:42.829434 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bfwmd","kube-system/global-pull-secret-syncer-hb4xm","kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5t5t4","openshift-multus/network-metrics-daemon-2s8sw","openshift-network-operator/iptables-alerter-skfzd","kube-system/konnectivity-agent-wjpkj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg","openshift-image-registry/node-ca-4glfj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal","openshift-multus/multus-additional-cni-plugins-shvmr","openshift-multus/multus-dz6nf","openshift-network-diagnostics/network-check-target-jz5tk"] Apr 24 23:53:42.832944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.832917 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.834442 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.834197 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.834442 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.834292 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:42.834442 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:42.834356 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:42.835630 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.835608 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.835747 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.835736 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 23:53:42.836068 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.836045 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 23:53:42.836165 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.836071 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dzpsh\"" Apr 24 23:53:42.837038 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.837021 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.837256 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.837232 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.837536 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.837500 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2b6p8\"" Apr 24 23:53:42.837769 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.837746 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.837944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.837920 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:42.838229 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.838212 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.838343 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.838240 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-g77b2\"" Apr 24 23:53:42.838486 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.838469 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.838796 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.838769 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.838938 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.838917 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.840583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.840209 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:42.840583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.840307 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 23:53:42.840583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.840328 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r5w2g\"" Apr 24 23:53:42.840583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.840341 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.840809 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.840747 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 23:53:42.841292 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841100 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:42.841292 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841211 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 23:53:42.841292 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841243 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.841443 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841409 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k2d9m\"" Apr 24 23:53:42.841538 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841523 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 23:53:42.841538 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.841532 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.842378 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.842349 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.842974 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.842957 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.843064 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.843014 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.843132 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.843062 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 23:53:42.843132 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.843092 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-w8c42\"" Apr 24 23:53:42.843971 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.843951 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.844640 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.844622 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:42.844968 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.844953 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:42.845086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845067 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:42.845164 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845095 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:42.845164 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845115 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f7ts4\"" Apr 24 23:53:42.845164 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845145 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:42.845546 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845527 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:42.845655 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845549 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:42.845655 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845581 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnswm\" (UniqueName: \"kubernetes.io/projected/f8df9612-54a5-4673-b2cc-33d7768fe61c-kube-api-access-xnswm\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:42.845655 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:42.845599 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:42.845655 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845608 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-host-slash\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.845840 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845651 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8nn6\" (UniqueName: \"kubernetes.io/projected/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-kube-api-access-j8nn6\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.845840 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845723 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysconfig\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.845840 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845755 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-conf\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.845840 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845785 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8h4hc\" (UniqueName: \"kubernetes.io/projected/986136f8-eb91-4644-8a16-17e1b919fac0-kube-api-access-8h4hc\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845859 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-iptables-alerter-script\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845885 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-kubernetes\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845907 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-systemd\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845952 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-run\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:53:42.845968 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-var-lib-kubelet\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.845987 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-host\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846007 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846037 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-sys\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846056 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-56vnv\"" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846064 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-tmp\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846067 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846085 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ef7298e-690a-414e-92a8-45d6a5710aa9-agent-certs\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846110 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ef7298e-690a-414e-92a8-45d6a5710aa9-konnectivity-ca\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846134 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-modprobe-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846148 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-lib-modules\") pod \"tuned-5t5t4\" (UID: 
\"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846295 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846162 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-etc-tuned\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.846903 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.846863 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:42.846964 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:42.846917 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:42.927592 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.927519 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:41 +0000 UTC" deadline="2027-12-29 02:39:13.46983295 +0000 UTC" Apr 24 23:53:42.927592 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.927548 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14714h45m30.542288838s" Apr 24 23:53:42.936038 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.936020 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:42.947291 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947265 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-system-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.947406 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947305 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-hostroot\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.947406 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947329 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-kubelet\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 
23:53:42.947406 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947352 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-systemd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947406 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947396 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-node-log\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947431 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947482 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnswm\" (UniqueName: \"kubernetes.io/projected/f8df9612-54a5-4673-b2cc-33d7768fe61c-kube-api-access-xnswm\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947507 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8nn6\" (UniqueName: \"kubernetes.io/projected/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-kube-api-access-j8nn6\") pod \"iptables-alerter-skfzd\" (UID: 
\"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947528 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-conf\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947545 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h4hc\" (UniqueName: \"kubernetes.io/projected/986136f8-eb91-4644-8a16-17e1b919fac0-kube-api-access-8h4hc\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947569 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947614 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947610 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-device-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947652 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-run\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947678 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-var-lib-kubelet\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947705 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947730 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-cnibin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947756 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-systemd-units\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947764 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-var-lib-kubelet\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947767 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-conf\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947779 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-log-socket\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947782 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-run\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947805 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-tmp\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947831 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-kubelet-config\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947856 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947878 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-etc-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947903 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-ovn\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947927 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-script-lib\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947951 
2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpwd\" (UniqueName: \"kubernetes.io/projected/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-kube-api-access-fbpwd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.947973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.947975 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-etc-tuned\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948008 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948034 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-k8s-cni-cncf-io\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948058 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-bin\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948100 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948141 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-registration-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948167 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjrr\" (UniqueName: \"kubernetes.io/projected/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kube-api-access-6vjrr\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948191 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-host\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948182 2559 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948209 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-multus\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948260 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-host\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948268 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tsmh\" (UniqueName: \"kubernetes.io/projected/aa3ab10f-a4b0-49f5-8458-86e3138f3237-kube-api-access-4tsmh\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948312 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-multus-certs\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948339 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-slash\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948368 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-modprobe-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948394 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-system-cni-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948421 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-cni-binary-copy\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.948734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948446 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-sys-fs\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948487 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-os-release\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948493 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-modprobe-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948512 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w46bj\" (UniqueName: \"kubernetes.io/projected/991de149-fd35-4947-8e6c-35dfa11c084c-kube-api-access-w46bj\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948538 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948562 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-host-slash\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.949433 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:53:42.948620 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-netns\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948653 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948666 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-host-slash\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:42.948672 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948679 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948716 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-iptables-alerter-script\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:42.948767 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:43.448726022 +0000 UTC m=+3.090640889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948821 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-kubernetes\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948854 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948879 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-bin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.949433 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948902 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-etc-kubernetes\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948924 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-kubernetes\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948928 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f7j\" (UniqueName: \"kubernetes.io/projected/cc64b50b-da56-49cb-b2a2-054b925980cf-kube-api-access-w5f7j\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.948950 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-dbus\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:53:42.948981 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ef7298e-690a-414e-92a8-45d6a5710aa9-agent-certs\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949048 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ef7298e-690a-414e-92a8-45d6a5710aa9-konnectivity-ca\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949153 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-lib-modules\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949202 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/991de149-fd35-4947-8e6c-35dfa11c084c-serviceca\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949228 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-os-release\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949261 
2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-kubelet\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949311 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-var-lib-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949339 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949364 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysconfig\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949390 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-binary-copy\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " 
pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949416 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-tuning-conf-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949419 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-lib-modules\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949442 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.950206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949495 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-daemon-config\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949542 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysconfig\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949574 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-netd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949601 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-config\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949627 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-env-overrides\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949659 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-systemd\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949693 2559 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ef7298e-690a-414e-92a8-45d6a5710aa9-konnectivity-ca\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949705 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-systemd\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949718 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cnibin\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949747 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-socket-dir-parent\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949774 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-netns\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949798 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-conf-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949810 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-iptables-alerter-script\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949826 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-socket-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949862 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949910 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-sys\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949935 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/991de149-fd35-4947-8e6c-35dfa11c084c-host\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj"
Apr 24 23:53:42.951037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.949997 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-sys\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951735 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.950009 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/986136f8-eb91-4644-8a16-17e1b919fac0-etc-sysctl-d\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951735 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.951407 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-tmp\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951735 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.951482 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/986136f8-eb91-4644-8a16-17e1b919fac0-etc-tuned\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:42.951986 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.951965 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ef7298e-690a-414e-92a8-45d6a5710aa9-agent-certs\") pod \"konnectivity-agent-wjpkj\" (UID: \"0ef7298e-690a-414e-92a8-45d6a5710aa9\") " pod="kube-system/konnectivity-agent-wjpkj"
Apr 24 23:53:42.957861 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.957832 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8nn6\" (UniqueName: \"kubernetes.io/projected/e2c1b3f5-cd6d-4849-a473-0eb71003f6b1-kube-api-access-j8nn6\") pod \"iptables-alerter-skfzd\" (UID: \"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1\") " pod="openshift-network-operator/iptables-alerter-skfzd"
Apr 24 23:53:42.958141 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.958120 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnswm\" (UniqueName: \"kubernetes.io/projected/f8df9612-54a5-4673-b2cc-33d7768fe61c-kube-api-access-xnswm\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:53:42.958496 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:42.958476 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h4hc\" (UniqueName: \"kubernetes.io/projected/986136f8-eb91-4644-8a16-17e1b919fac0-kube-api-access-8h4hc\") pod \"tuned-5t5t4\" (UID: \"986136f8-eb91-4644-8a16-17e1b919fac0\") " pod="openshift-cluster-node-tuning-operator/tuned-5t5t4"
Apr 24 23:53:43.050687 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050653 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.050841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050700 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-binary-copy\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.050841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050722 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-tuning-conf-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.050841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050765 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.050841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050829 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050864 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-daemon-config\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050878 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-tuning-conf-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050906 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-netd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.050948 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-netd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051002 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-config\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051042 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051039 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-env-overrides\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051244 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051165 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cnibin\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.051244 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051193 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-socket-dir-parent\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051244 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051219 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-netns\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051243 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-conf-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051270 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-socket-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051296 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/991de149-fd35-4947-8e6c-35dfa11c084c-host\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051304 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-netns\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051302 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-socket-dir-parent\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051266 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cnibin\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051324 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-system-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051360 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/991de149-fd35-4947-8e6c-35dfa11c084c-host\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051360 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-hostroot\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051390 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051379 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-system-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051400 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051401 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-socket-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051403 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-daemon-config\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051424 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-hostroot\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051436 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-kubelet\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051490 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-kubelet\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051437 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-conf-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051512 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-systemd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051541 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-node-log\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051565 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051575 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-node-log\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051582 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051542 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-systemd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051605 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-device-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051631 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051637 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051652 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-cnibin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.051824 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051676 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-systemd-units\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051681 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-device-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051691 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-multus-cni-dir\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051699 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-log-socket\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051728 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-kubelet-config\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051738 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-cnibin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051754 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051757 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-log-socket\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051728 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-systemd-units\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051788 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-kubelet-config\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051803 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-etc-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051829 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-ovn\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051857 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-etc-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.051866 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051868 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-binary-copy\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051885 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-run-ovn\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.051919 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:43.551902066 +0000 UTC m=+3.193816916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:43.052639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051942 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-script-lib\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.051969 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpwd\" (UniqueName: \"kubernetes.io/projected/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-kube-api-access-fbpwd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052005 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052044 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-k8s-cni-cncf-io\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052067 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-bin\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052091 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052118 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-registration-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052142 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-k8s-cni-cncf-io\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052144 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjrr\" (UniqueName: \"kubernetes.io/projected/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kube-api-access-6vjrr\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052304 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-config\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052304 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052303 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-env-overrides\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052341 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-cni-bin\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052360 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-multus\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052368 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-registration-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052395 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tsmh\" (UniqueName: \"kubernetes.io/projected/aa3ab10f-a4b0-49f5-8458-86e3138f3237-kube-api-access-4tsmh\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052422 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-multus-certs\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.053419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052429 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-multus\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052495 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-run-multus-certs\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052517 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovnkube-script-lib\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052528 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-slash\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052560 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-slash\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052561 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-system-cni-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052591 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-cni-binary-copy\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052592 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-system-cni-dir\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052616 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-sys-fs\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052652 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-sys-fs\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052613 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa3ab10f-a4b0-49f5-8458-86e3138f3237-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052659 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-os-release\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052699 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w46bj\" (UniqueName: \"kubernetes.io/projected/991de149-fd35-4947-8e6c-35dfa11c084c-kube-api-access-w46bj\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052739 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-netns\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052745 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa3ab10f-a4b0-49f5-8458-86e3138f3237-os-release\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " 
pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052765 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052785 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:43.054086 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052805 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-host-run-netns\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052831 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052844 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052853 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-bin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052871 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-etc-kubernetes\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052887 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f7j\" (UniqueName: \"kubernetes.io/projected/cc64b50b-da56-49cb-b2a2-054b925980cf-kube-api-access-w5f7j\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052905 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-dbus\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052930 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/991de149-fd35-4947-8e6c-35dfa11c084c-serviceca\") pod \"node-ca-4glfj\" 
(UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052952 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-os-release\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052950 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-etc-kubernetes\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052967 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-kubelet\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052986 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-var-lib-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053006 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-cni-bin\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " 
pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053036 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-var-lib-openvswitch\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053039 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc64b50b-da56-49cb-b2a2-054b925980cf-cni-binary-copy\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.052965 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3a90767-4e7f-42e0-8033-f0e9aba778bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053048 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-host-var-lib-kubelet\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053083 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc64b50b-da56-49cb-b2a2-054b925980cf-os-release\") pod \"multus-dz6nf\" (UID: \"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 
23:53:43.054682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053173 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67b28161-03e9-4905-8e32-8b7353db6c58-dbus\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:43.055251 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.053374 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/991de149-fd35-4947-8e6c-35dfa11c084c-serviceca\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:43.055251 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.054422 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-ovn-node-metrics-cert\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.064055 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.064033 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:43.064055 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.064056 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:43.064055 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.064068 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:43.064285 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.064141 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. No retries permitted until 2026-04-24 23:53:43.564123797 +0000 UTC m=+3.206038648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:43.064285 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.064228 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpwd\" (UniqueName: \"kubernetes.io/projected/3a0dad7b-4a0e-485f-9092-becacb1cd8a8-kube-api-access-fbpwd\") pod \"ovnkube-node-bfwmd\" (UID: \"3a0dad7b-4a0e-485f-9092-becacb1cd8a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.064407 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.064364 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w46bj\" (UniqueName: \"kubernetes.io/projected/991de149-fd35-4947-8e6c-35dfa11c084c-kube-api-access-w46bj\") pod \"node-ca-4glfj\" (UID: \"991de149-fd35-4947-8e6c-35dfa11c084c\") " pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:43.065238 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.065217 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f7j\" (UniqueName: \"kubernetes.io/projected/cc64b50b-da56-49cb-b2a2-054b925980cf-kube-api-access-w5f7j\") pod \"multus-dz6nf\" (UID: 
\"cc64b50b-da56-49cb-b2a2-054b925980cf\") " pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.065503 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.065434 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjrr\" (UniqueName: \"kubernetes.io/projected/f3a90767-4e7f-42e0-8033-f0e9aba778bf-kube-api-access-6vjrr\") pod \"aws-ebs-csi-driver-node-smrgg\" (UID: \"f3a90767-4e7f-42e0-8033-f0e9aba778bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.066271 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.066254 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tsmh\" (UniqueName: \"kubernetes.io/projected/aa3ab10f-a4b0-49f5-8458-86e3138f3237-kube-api-access-4tsmh\") pod \"multus-additional-cni-plugins-shvmr\" (UID: \"aa3ab10f-a4b0-49f5-8458-86e3138f3237\") " pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:43.144596 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.144555 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wjpkj" Apr 24 23:53:43.152368 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.152339 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" Apr 24 23:53:43.161022 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.160999 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-skfzd" Apr 24 23:53:43.166652 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.166630 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" Apr 24 23:53:43.174271 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.174252 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" Apr 24 23:53:43.179883 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.179830 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4glfj" Apr 24 23:53:43.185374 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.185358 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-shvmr" Apr 24 23:53:43.188873 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.188856 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dz6nf" Apr 24 23:53:43.441667 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.441422 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a0dad7b_4a0e_485f_9092_becacb1cd8a8.slice/crio-f6e2591eec82946144a2202111b8dea1b99e22cc13e7d5c0948ac1b02e864dbe WatchSource:0}: Error finding container f6e2591eec82946144a2202111b8dea1b99e22cc13e7d5c0948ac1b02e864dbe: Status 404 returned error can't find the container with id f6e2591eec82946144a2202111b8dea1b99e22cc13e7d5c0948ac1b02e864dbe Apr 24 23:53:43.443067 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.443011 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef7298e_690a_414e_92a8_45d6a5710aa9.slice/crio-56487ba687c0bb985df6db4e31b6b4df0d3e9546eac3a8402cf9d3c455f5a801 WatchSource:0}: Error finding container 56487ba687c0bb985df6db4e31b6b4df0d3e9546eac3a8402cf9d3c455f5a801: Status 404 returned error can't find the container with id 56487ba687c0bb985df6db4e31b6b4df0d3e9546eac3a8402cf9d3c455f5a801 Apr 24 23:53:43.443686 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.443665 2559 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc64b50b_da56_49cb_b2a2_054b925980cf.slice/crio-a8c615080a50c873c80504e3a7eacdd5a631e6c27257f862d9a6f0b5d1986a0c WatchSource:0}: Error finding container a8c615080a50c873c80504e3a7eacdd5a631e6c27257f862d9a6f0b5d1986a0c: Status 404 returned error can't find the container with id a8c615080a50c873c80504e3a7eacdd5a631e6c27257f862d9a6f0b5d1986a0c Apr 24 23:53:43.445920 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.445899 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991de149_fd35_4947_8e6c_35dfa11c084c.slice/crio-de43ebeb81165ed2b2a4e209dc1022c3dc82ae6c32e5cde4b166694970275572 WatchSource:0}: Error finding container de43ebeb81165ed2b2a4e209dc1022c3dc82ae6c32e5cde4b166694970275572: Status 404 returned error can't find the container with id de43ebeb81165ed2b2a4e209dc1022c3dc82ae6c32e5cde4b166694970275572 Apr 24 23:53:43.447133 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.447067 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986136f8_eb91_4644_8a16_17e1b919fac0.slice/crio-1f600a847e4407aa5f25d3d064b4916293b427526c0089583bce586ed514569a WatchSource:0}: Error finding container 1f600a847e4407aa5f25d3d064b4916293b427526c0089583bce586ed514569a: Status 404 returned error can't find the container with id 1f600a847e4407aa5f25d3d064b4916293b427526c0089583bce586ed514569a Apr 24 23:53:43.447802 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.447726 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3ab10f_a4b0_49f5_8458_86e3138f3237.slice/crio-24907d23deaa5d1e305ff1777d9f90cb54ea0c849ba18749110c29c87d9eb9ba WatchSource:0}: Error finding container 24907d23deaa5d1e305ff1777d9f90cb54ea0c849ba18749110c29c87d9eb9ba: Status 404 returned error can't find the 
container with id 24907d23deaa5d1e305ff1777d9f90cb54ea0c849ba18749110c29c87d9eb9ba Apr 24 23:53:43.449083 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.449060 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c1b3f5_cd6d_4849_a473_0eb71003f6b1.slice/crio-589d8ef02a15615b19f6e37423c150ceecca29bc6fa4757f58c2623b0ab31c5e WatchSource:0}: Error finding container 589d8ef02a15615b19f6e37423c150ceecca29bc6fa4757f58c2623b0ab31c5e: Status 404 returned error can't find the container with id 589d8ef02a15615b19f6e37423c150ceecca29bc6fa4757f58c2623b0ab31c5e Apr 24 23:53:43.449996 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:53:43.449978 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a90767_4e7f_42e0_8033_f0e9aba778bf.slice/crio-31a500f9b0a28584f6b8da479b3c692a09a2c73e2358cbbe55154ced9341af85 WatchSource:0}: Error finding container 31a500f9b0a28584f6b8da479b3c692a09a2c73e2358cbbe55154ced9341af85: Status 404 returned error can't find the container with id 31a500f9b0a28584f6b8da479b3c692a09a2c73e2358cbbe55154ced9341af85 Apr 24 23:53:43.455981 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.455961 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:43.456118 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.456099 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:43.456201 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.456159 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.456143289 +0000 UTC m=+4.098058140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:43.557224 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.557201 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:43.557422 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.557347 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:43.557422 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.557406 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.557389408 +0000 UTC m=+4.199304260 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:43.657625 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.657585 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:43.657782 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.657760 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:43.657854 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.657791 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:43.657854 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.657806 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:43.657960 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.657878 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:44.65785932 +0000 UTC m=+4.299774174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:43.928538 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.928455 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:41 +0000 UTC" deadline="2027-11-12 00:19:33.734675866 +0000 UTC" Apr 24 23:53:43.928538 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.928507 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13584h25m49.806172925s" Apr 24 23:53:43.932245 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.931769 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:43.932245 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:43.931889 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:43.949615 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.949583 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" event={"ID":"f3a90767-4e7f-42e0-8033-f0e9aba778bf","Type":"ContainerStarted","Data":"31a500f9b0a28584f6b8da479b3c692a09a2c73e2358cbbe55154ced9341af85"} Apr 24 23:53:43.952177 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.952113 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-skfzd" event={"ID":"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1","Type":"ContainerStarted","Data":"589d8ef02a15615b19f6e37423c150ceecca29bc6fa4757f58c2623b0ab31c5e"} Apr 24 23:53:43.955202 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.955153 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" event={"ID":"986136f8-eb91-4644-8a16-17e1b919fac0","Type":"ContainerStarted","Data":"1f600a847e4407aa5f25d3d064b4916293b427526c0089583bce586ed514569a"} Apr 24 23:53:43.958050 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.957960 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dz6nf" event={"ID":"cc64b50b-da56-49cb-b2a2-054b925980cf","Type":"ContainerStarted","Data":"a8c615080a50c873c80504e3a7eacdd5a631e6c27257f862d9a6f0b5d1986a0c"} Apr 24 23:53:43.963444 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.963415 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wjpkj" event={"ID":"0ef7298e-690a-414e-92a8-45d6a5710aa9","Type":"ContainerStarted","Data":"56487ba687c0bb985df6db4e31b6b4df0d3e9546eac3a8402cf9d3c455f5a801"} Apr 24 23:53:43.965529 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.965504 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" 
event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"f6e2591eec82946144a2202111b8dea1b99e22cc13e7d5c0948ac1b02e864dbe"} Apr 24 23:53:43.976472 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.976423 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" event={"ID":"b0e49161c603a5579b9b31b2ffe9b2e8","Type":"ContainerStarted","Data":"bed987ac3455d8e004f7a59582c95f3f1a2689bc68d09e92bdce6eff208f5d28"} Apr 24 23:53:43.979345 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.979321 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerStarted","Data":"24907d23deaa5d1e305ff1777d9f90cb54ea0c849ba18749110c29c87d9eb9ba"} Apr 24 23:53:43.992360 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:43.992333 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4glfj" event={"ID":"991de149-fd35-4947-8e6c-35dfa11c084c","Type":"ContainerStarted","Data":"de43ebeb81165ed2b2a4e209dc1022c3dc82ae6c32e5cde4b166694970275572"} Apr 24 23:53:44.468030 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:44.467996 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:44.468198 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.468158 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.468256 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.468223 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:46.468204742 +0000 UTC m=+6.110119616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.568989 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:44.568943 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:44.569146 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.569133 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:44.569226 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.569198 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:46.569178399 +0000 UTC m=+6.211093263 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:44.670360 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:44.670320 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:44.670545 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.670521 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:44.670545 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.670544 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:44.670677 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.670558 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:44.670677 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.670620 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:46.670601813 +0000 UTC m=+6.312516685 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:44.934150 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:44.933174 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:44.934150 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.933356 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:44.934150 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:44.933900 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:44.934150 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:44.934031 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:45.006821 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:45.005607 2559 generic.go:358] "Generic (PLEG): container finished" podID="30bae4ccdc5c5470ae5b607b233b33aa" containerID="6f7c7a46717b386c547805830f5291a8cfb446b56af548cd5cba9edd5a8476b4" exitCode=0 Apr 24 23:53:45.006821 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:45.006584 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" event={"ID":"30bae4ccdc5c5470ae5b607b233b33aa","Type":"ContainerDied","Data":"6f7c7a46717b386c547805830f5291a8cfb446b56af548cd5cba9edd5a8476b4"} Apr 24 23:53:45.020248 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:45.020198 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-4.ec2.internal" podStartSLOduration=3.020180698 podStartE2EDuration="3.020180698s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:43.989602162 +0000 UTC m=+3.631517036" watchObservedRunningTime="2026-04-24 23:53:45.020180698 +0000 UTC m=+4.662095573" Apr 24 23:53:45.932113 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:45.932077 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:45.932304 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:45.932201 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:46.013709 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.013671 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" event={"ID":"30bae4ccdc5c5470ae5b607b233b33aa","Type":"ContainerStarted","Data":"52b1456ca436a3107440f6a09808f1090731227a8b4c22817f142f4bc5313b60"} Apr 24 23:53:46.029218 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.029155 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-4.ec2.internal" podStartSLOduration=4.029136814 podStartE2EDuration="4.029136814s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:46.02858194 +0000 UTC m=+5.670496814" watchObservedRunningTime="2026-04-24 23:53:46.029136814 +0000 UTC m=+5.671051691" Apr 24 23:53:46.486546 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.486499 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:46.486716 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.486700 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:46.486775 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.486762 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c 
nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.486744118 +0000 UTC m=+10.128658975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:46.588260 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.587717 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:46.588260 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.587884 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:46.588260 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.587941 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.587924311 +0000 UTC m=+10.229839167 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:46.689124 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.689091 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:46.689295 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.689254 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:46.689295 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.689276 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:46.689295 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.689288 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:46.689488 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.689342 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:50.689323494 +0000 UTC m=+10.331238349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:46.932357 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.932324 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:46.932548 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.932478 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:46.932791 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:46.932763 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:46.932896 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:46.932877 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:47.932157 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:47.932125 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:47.932619 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:47.932281 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:48.931895 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:48.931852 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:48.931895 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:48.931897 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:48.932130 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:48.932003 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:48.932413 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:48.932344 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:49.932329 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:49.932265 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:49.932524 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:49.932396 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:50.522741 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:50.522700 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:50.522940 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.522846 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:50.522940 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.522913 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:53:58.522896909 +0000 UTC m=+18.164811759 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:50.624104 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:50.623897 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:50.624104 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.624086 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:50.624318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.624156 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:58.62413591 +0000 UTC m=+18.266050769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:50.724982 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:50.724941 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:50.725166 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.725100 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:50.725166 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.725117 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:50.725166 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.725129 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:50.725335 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.725184 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:58.725167779 +0000 UTC m=+18.367082630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:50.932605 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:50.932567 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:50.933049 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.932699 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:50.934043 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:50.934022 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:50.934142 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:50.934122 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:51.932146 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:51.932099 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:51.932335 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:51.932232 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:52.931905 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:52.931870 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:52.932363 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:52.931913 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:52.932363 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:52.932017 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:52.932363 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:52.932157 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:53.932211 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:53.932174 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:53.932630 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:53.932287 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:54.932064 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:54.932028 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:54.932246 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:54.932031 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:54.932246 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:54.932166 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:54.932246 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:54.932229 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:55.931957 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:55.931921 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:55.932145 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:55.932043 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:53:56.931957 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:56.931924 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:53:56.932338 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:56.931925 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:53:56.932338 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:56.932041 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:53:56.932338 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:56.932107 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:53:57.932334 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:57.932299 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:53:57.932781 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:57.932453 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:53:58.584887 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:58.584854 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:53:58.585076 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.584975 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:58.585076 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.585032 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.585014268 +0000 UTC m=+34.226929119 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:58.685351 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:58.685317 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:53:58.685531 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.685421 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:58.685531 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.685490 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.685475891 +0000 UTC m=+34.327390743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:58.785807 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:58.785771 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:53:58.785976 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.785934 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:58.785976 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.785952 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:58.785976 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.785964 2559 projected.go:194] Error preparing data for projected volume kube-api-access-skk2h for pod openshift-network-diagnostics/network-check-target-jz5tk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:58.786086 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.786029 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h podName:800b9072-49a3-4275-947c-a73644d8448e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.786011236 +0000 UTC m=+34.427926121 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-skk2h" (UniqueName: "kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h") pod "network-check-target-jz5tk" (UID: "800b9072-49a3-4275-947c-a73644d8448e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:58.932568 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:58.932531 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:53:58.932568 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:58.932560 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:53:58.933073 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.932678 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c"
Apr 24 23:53:58.933073 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:58.932817 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58"
Apr 24 23:53:59.924064 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.924036 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wkhjm"]
Apr 24 23:53:59.943690 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.943667 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:53:59.944084 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:53:59.943751 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:53:59.944084 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.943832 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:53:59.946653 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.946636 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-l5cdl\""
Apr 24 23:53:59.947183 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.947167 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 23:53:59.947583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.947568 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 23:53:59.992016 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.991992 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3189ee75-b94d-4dc4-a4f0-5805c80f852c-hosts-file\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:53:59.992110 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.992045 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczdw\" (UniqueName: \"kubernetes.io/projected/3189ee75-b94d-4dc4-a4f0-5805c80f852c-kube-api-access-bczdw\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:53:59.992110 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:53:59.992102 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3189ee75-b94d-4dc4-a4f0-5805c80f852c-tmp-dir\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.093183 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.093153 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3189ee75-b94d-4dc4-a4f0-5805c80f852c-tmp-dir\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.093327 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.093197 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3189ee75-b94d-4dc4-a4f0-5805c80f852c-hosts-file\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.093327 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.093261 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bczdw\" (UniqueName: \"kubernetes.io/projected/3189ee75-b94d-4dc4-a4f0-5805c80f852c-kube-api-access-bczdw\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.093410 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.093361 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3189ee75-b94d-4dc4-a4f0-5805c80f852c-hosts-file\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.093486 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.093455 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3189ee75-b94d-4dc4-a4f0-5805c80f852c-tmp-dir\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.102504 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.102483 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczdw\" (UniqueName: \"kubernetes.io/projected/3189ee75-b94d-4dc4-a4f0-5805c80f852c-kube-api-access-bczdw\") pod \"node-resolver-wkhjm\" (UID: \"3189ee75-b94d-4dc4-a4f0-5805c80f852c\") " pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.252104 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.251827 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wkhjm"
Apr 24 23:54:00.273810 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:00.273573 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3189ee75_b94d_4dc4_a4f0_5805c80f852c.slice/crio-453b0fde090155334d2bfae1ea5cf857ea9a48573fd84302f5c2951a7c591d72 WatchSource:0}: Error finding container 453b0fde090155334d2bfae1ea5cf857ea9a48573fd84302f5c2951a7c591d72: Status 404 returned error can't find the container with id 453b0fde090155334d2bfae1ea5cf857ea9a48573fd84302f5c2951a7c591d72
Apr 24 23:54:00.932402 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.932233 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:00.932584 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:00.932303 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:00.932584 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:00.932512 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c"
Apr 24 23:54:00.932698 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:00.932580 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58"
Apr 24 23:54:01.041226 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.041135 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" event={"ID":"f3a90767-4e7f-42e0-8033-f0e9aba778bf","Type":"ContainerStarted","Data":"8bb694451020c281854546229135f3392a08eefa9133dbbc57c923a57de4fea9"}
Apr 24 23:54:01.042728 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.042695 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" event={"ID":"986136f8-eb91-4644-8a16-17e1b919fac0","Type":"ContainerStarted","Data":"895f05063b8da4ec0f952f32bedd6e321c35e4d3414d870ab2dbd366cb1a80da"}
Apr 24 23:54:01.044122 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.044084 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dz6nf" event={"ID":"cc64b50b-da56-49cb-b2a2-054b925980cf","Type":"ContainerStarted","Data":"2e566e2e1dcf0b75766d45ac541a4c449b7cbca743db547221586dfaf7a09a63"}
Apr 24 23:54:01.045495 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.045446 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wjpkj" event={"ID":"0ef7298e-690a-414e-92a8-45d6a5710aa9","Type":"ContainerStarted","Data":"0c55c530cd5dd2913fc536ab6ba402bdcb2ffa70d0a852d5a13b4a5f99e41e50"}
Apr 24 23:54:01.048030 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048011 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:54:01.048360 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048337 2559 generic.go:358] "Generic (PLEG): container finished" podID="3a0dad7b-4a0e-485f-9092-becacb1cd8a8" containerID="1c48cbd753e13564358a0d2326a3d9f9db29e401f9bebf7ba24bb00a87a8f480" exitCode=1
Apr 24 23:54:01.048445 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048400 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"67e865c444ecd7002ff9f88715d797699ac65f6bc2b6afcad160dd0ab3756d41"}
Apr 24 23:54:01.048445 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048423 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"16d470822194bc68d21f2a3af960d958bb5eb2fce195379933b6dc317d669d1c"}
Apr 24 23:54:01.048445 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048436 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"fa0cbe757fa194f1f892c8cf58f9515637123ddb242c6c9b0a35d07b9b0dd13b"}
Apr 24 23:54:01.048594 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048448 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"94fab146e11bb16a2b9bd000235136fa3c9e572b405f1cd7b64d6606af3fa5da"}
Apr 24 23:54:01.048594 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048479 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerDied","Data":"1c48cbd753e13564358a0d2326a3d9f9db29e401f9bebf7ba24bb00a87a8f480"}
Apr 24 23:54:01.048594 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.048493 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"26b2e1bd90a175e28d7de4f0e645e57b95a4dea1f93bd736c9834256dfb65e31"}
Apr 24 23:54:01.049736 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.049706 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wkhjm" event={"ID":"3189ee75-b94d-4dc4-a4f0-5805c80f852c","Type":"ContainerStarted","Data":"9d090185786436f5c55a41c7c67fdfb61e75cb5f6867975ad8a2930956683759"}
Apr 24 23:54:01.049846 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.049739 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wkhjm" event={"ID":"3189ee75-b94d-4dc4-a4f0-5805c80f852c","Type":"ContainerStarted","Data":"453b0fde090155334d2bfae1ea5cf857ea9a48573fd84302f5c2951a7c591d72"}
Apr 24 23:54:01.051205 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.051119 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="ed1f9e8981ec9a629c22092096848fa3c2254cd4dd3eb57969a0468b36b478e5" exitCode=0
Apr 24 23:54:01.051205 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.051171 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"ed1f9e8981ec9a629c22092096848fa3c2254cd4dd3eb57969a0468b36b478e5"}
Apr 24 23:54:01.052617 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.052581 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4glfj" event={"ID":"991de149-fd35-4947-8e6c-35dfa11c084c","Type":"ContainerStarted","Data":"2322a55a61b245c008a850850c8dfc1ef35e679c47ed413fb791441de2d06049"}
Apr 24 23:54:01.078022 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.077979 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5t5t4" podStartSLOduration=3.358004979 podStartE2EDuration="20.077965699s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.448969098 +0000 UTC m=+3.090883962" lastFinishedPulling="2026-04-24 23:54:00.168929817 +0000 UTC m=+19.810844682" observedRunningTime="2026-04-24 23:54:01.076879506 +0000 UTC m=+20.718794378" watchObservedRunningTime="2026-04-24 23:54:01.077965699 +0000 UTC m=+20.719880574"
Apr 24 23:54:01.140343 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.140283 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dz6nf" podStartSLOduration=3.102014846 podStartE2EDuration="20.140265715s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.44583709 +0000 UTC m=+3.087751940" lastFinishedPulling="2026-04-24 23:54:00.484087956 +0000 UTC m=+20.126002809" observedRunningTime="2026-04-24 23:54:01.139021805 +0000 UTC m=+20.780936715" watchObservedRunningTime="2026-04-24 23:54:01.140265715 +0000 UTC m=+20.782180590"
Apr 24 23:54:01.172351 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.172305 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wkhjm" podStartSLOduration=2.17229086 podStartE2EDuration="2.17229086s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:01.172046821 +0000 UTC m=+20.813961921" watchObservedRunningTime="2026-04-24 23:54:01.17229086 +0000 UTC m=+20.814205732"
Apr 24 23:54:01.195138 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.195097 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wjpkj" podStartSLOduration=3.471060961 podStartE2EDuration="20.195083406s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.44475474 +0000 UTC m=+3.086669605" lastFinishedPulling="2026-04-24 23:54:00.168777194 +0000 UTC m=+19.810692050" observedRunningTime="2026-04-24 23:54:01.195010581 +0000 UTC m=+20.836925455" watchObservedRunningTime="2026-04-24 23:54:01.195083406 +0000 UTC m=+20.836998275"
Apr 24 23:54:01.215315 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.215275 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4glfj" podStartSLOduration=3.494261661 podStartE2EDuration="20.21526255s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.447971756 +0000 UTC m=+3.089886610" lastFinishedPulling="2026-04-24 23:54:00.168972643 +0000 UTC m=+19.810887499" observedRunningTime="2026-04-24 23:54:01.214876927 +0000 UTC m=+20.856791800" watchObservedRunningTime="2026-04-24 23:54:01.21526255 +0000 UTC m=+20.857177425"
Apr 24 23:54:01.326708 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.326680 2559 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:54:01.916796 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.916679 2559 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:01.326703603Z","UUID":"6b2c4eee-7676-450c-ba04-feca8a11c506","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:54:01.918551 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.918527 2559 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:54:01.918551 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.918560 2559 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:54:01.931643 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:01.931574 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:01.931775 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:01.931688 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:54:02.056928 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:02.056831 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" event={"ID":"f3a90767-4e7f-42e0-8033-f0e9aba778bf","Type":"ContainerStarted","Data":"bd13ee59603d35c8225636d856480a8a6604e286a378007838bd557f3cfe8423"}
Apr 24 23:54:02.058412 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:02.058365 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-skfzd" event={"ID":"e2c1b3f5-cd6d-4849-a473-0eb71003f6b1","Type":"ContainerStarted","Data":"cbb031a71bb58e486eb5236425f5a93169e2e986e9e9ef3ff136be5c00dd4008"}
Apr 24 23:54:02.072264 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:02.072217 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-skfzd" podStartSLOduration=4.354659706 podStartE2EDuration="21.072202733s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.451231376 +0000 UTC m=+3.093146240" lastFinishedPulling="2026-04-24 23:54:00.168774403 +0000 UTC m=+19.810689267" observedRunningTime="2026-04-24 23:54:02.071701563 +0000 UTC m=+21.713616438" watchObservedRunningTime="2026-04-24 23:54:02.072202733 +0000 UTC m=+21.714117626"
Apr 24 23:54:02.935964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:02.935934 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:02.935964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:02.935951 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:02.936219 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:02.936050 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58"
Apr 24 23:54:02.936219 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:02.936165 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c"
Apr 24 23:54:03.062945 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:03.062872 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:54:03.063579 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:03.063259 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"7830d8060ab3b0f0b8d5f8138dc9ee406ebd8620ee53843b03be946d917a7813"}
Apr 24 23:54:03.065280 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:03.065240 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" event={"ID":"f3a90767-4e7f-42e0-8033-f0e9aba778bf","Type":"ContainerStarted","Data":"195560bce9f04bfd76baedbb17795915c16bb2d58294d09d0d68dc8a9553854e"}
Apr 24 23:54:03.080633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:03.080588 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smrgg" podStartSLOduration=3.447152257 podStartE2EDuration="22.080575258s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.452238882 +0000 UTC m=+3.094153733" lastFinishedPulling="2026-04-24 23:54:02.085661869 +0000 UTC m=+21.727576734" observedRunningTime="2026-04-24 23:54:03.080087449 +0000 UTC m=+22.722002323" watchObservedRunningTime="2026-04-24 23:54:03.080575258 +0000 UTC m=+22.722490160"
Apr 24 23:54:03.932517 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:03.932487 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:03.932699 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:03.932607 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:54:04.934301 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:04.934276 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:04.934810 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:04.934274 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:04.934810 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:04.934401 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c"
Apr 24 23:54:04.934810 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:04.934528 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58"
Apr 24 23:54:05.072452 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.072429 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:54:05.072798 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.072776 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"3992263bbc20f1fe313ddae319802beb41692171f64c2f8210695b9284afac53"}
Apr 24 23:54:05.073063 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.073049 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:05.073117 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.073071 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:05.073214 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.073199 2559 scope.go:117] "RemoveContainer" containerID="1c48cbd753e13564358a0d2326a3d9f9db29e401f9bebf7ba24bb00a87a8f480"
Apr 24 23:54:05.090051 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.090025 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:05.507387 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.507355 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wjpkj"
Apr 24 23:54:05.507979 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.507962 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wjpkj"
Apr 24 23:54:05.932308 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:05.932285 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:05.932439 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:05.932384 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:54:06.077999 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.077976 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:54:06.078416 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.078273 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" event={"ID":"3a0dad7b-4a0e-485f-9092-becacb1cd8a8","Type":"ContainerStarted","Data":"9f17918c8ae3cf64f9069a431555a7529f8f7f9a81657b7ee11423aef149f25b"}
Apr 24 23:54:06.078554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.078531 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:06.079989 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.079968 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="e66b3e54a7227312355c1120b5e4d35a86614fef9d718d08891b7e9f75126e3a" exitCode=0
Apr 24 23:54:06.080107 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.080058 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"e66b3e54a7227312355c1120b5e4d35a86614fef9d718d08891b7e9f75126e3a"}
Apr 24 23:54:06.080318 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.080303 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wjpkj"
Apr 24 23:54:06.081385 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.080743 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wjpkj"
Apr 24 23:54:06.094746 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.094727 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:06.108777 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.108737 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd" podStartSLOduration=8.316603456 podStartE2EDuration="25.10872729s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.443370528 +0000 UTC m=+3.085285396" lastFinishedPulling="2026-04-24 23:54:00.235494378 +0000 UTC m=+19.877409230" observedRunningTime="2026-04-24 23:54:06.107600048 +0000 UTC m=+25.749514920" watchObservedRunningTime="2026-04-24 23:54:06.10872729 +0000 UTC m=+25.750642162"
Apr 24 23:54:06.926448 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.926220 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hb4xm"]
Apr 24 23:54:06.926599 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.926585 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:06.926699 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:06.926680 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58"
Apr 24 23:54:06.929520 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.929496 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jz5tk"]
Apr 24 23:54:06.929667 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.929600 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:06.929730 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:06.929681 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e"
Apr 24 23:54:06.930088 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.930068 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2s8sw"]
Apr 24 23:54:06.930188 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:06.930170 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:06.930289 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:06.930270 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c"
Apr 24 23:54:07.084181 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:07.084105 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="cb380db60392c8a5b79ee45b0f4848f22d9d837fcd8466b5d9dffcc814dd7c8c" exitCode=0
Apr 24 23:54:07.084526 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:07.084184 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"cb380db60392c8a5b79ee45b0f4848f22d9d837fcd8466b5d9dffcc814dd7c8c"}
Apr 24 23:54:08.087896 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:08.087821 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="b76a4929539488c1d936d54a1b04cc3b8a6cccc7b950fc9dbb050c577abb77ec" exitCode=0
Apr 24 23:54:08.088228 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:08.087908 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"b76a4929539488c1d936d54a1b04cc3b8a6cccc7b950fc9dbb050c577abb77ec"}
Apr 24 23:54:08.931676 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:08.931642 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:54:08.931676 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:08.931662 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:54:08.931865 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:08.931764 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:54:08.931906 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:08.931860 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:54:08.932101 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:08.932084 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:54:08.932204 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:08.932183 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:54:10.933506 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:10.933380 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:54:10.933506 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:10.933415 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:54:10.934249 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:10.933523 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:54:10.934249 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:10.933564 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:54:10.934249 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:10.933600 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:54:10.934249 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:10.933677 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:54:12.931897 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:12.931854 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm" Apr 24 23:54:12.931897 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:12.931878 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:54:12.932397 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:12.931859 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw" Apr 24 23:54:12.932397 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:12.931990 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hb4xm" podUID="67b28161-03e9-4905-8e32-8b7353db6c58" Apr 24 23:54:12.932397 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:12.932096 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2s8sw" podUID="f8df9612-54a5-4673-b2cc-33d7768fe61c" Apr 24 23:54:12.932397 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:12.932175 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jz5tk" podUID="800b9072-49a3-4275-947c-a73644d8448e" Apr 24 23:54:13.200211 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.200181 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-4.ec2.internal" event="NodeReady" Apr 24 23:54:13.200379 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.200322 2559 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:13.240094 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.240058 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"] Apr 24 23:54:13.266330 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.266206 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4x6ss"] Apr 24 23:54:13.266497 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.266339 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.269866 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.269824 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:54:13.270001 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.269974 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68gn9\"" Apr 24 23:54:13.270075 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.270021 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:54:13.270160 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.270141 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:54:13.276598 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.276576 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:54:13.292072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.292019 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"] Apr 24 23:54:13.292203 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.292185 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.296651 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.296603 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 23:54:13.296951 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.296819 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.296951 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.296918 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.299013 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.298023 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 23:54:13.305388 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.305366 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lj5ln\"" Apr 24 23:54:13.312166 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.312126 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm"] Apr 24 23:54:13.312271 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.312258 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.312532 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.312508 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 23:54:13.314924 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.314904 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dcbd2\"" Apr 24 23:54:13.315500 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.315214 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 23:54:13.315500 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.315278 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.315639 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.315507 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.316338 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.316306 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 23:54:13.327164 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.327138 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-84bfffbb-vgvrl"] Apr 24 23:54:13.327568 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.327547 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" Apr 24 23:54:13.330316 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.330298 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6jhsz\"" Apr 24 23:54:13.330316 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.330309 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.330675 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.330658 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.343221 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.343200 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cmhdt"] Apr 24 23:54:13.343365 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.343344 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.346900 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.346880 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 23:54:13.346900 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.346897 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.347013 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.346931 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hxxq5\"" Apr 24 23:54:13.347013 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.346891 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 23:54:13.347228 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.347211 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 23:54:13.347383 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.347266 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.347567 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.347547 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 23:54:13.354278 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.354259 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"] Apr 24 23:54:13.354372 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.354288 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"] Apr 24 23:54:13.354445 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:54:13.354424 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.358560 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.358538 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:13.359345 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.359325 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:13.359437 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.359369 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xmfk4\"" Apr 24 23:54:13.367525 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.367507 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4ktn"] Apr 24 23:54:13.367666 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.367648 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.370301 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.370281 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.370403 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.370370 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.370403 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.370391 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-w5gns\"" Apr 24 23:54:13.372346 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.370307 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 23:54:13.380476 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.380437 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx"] Apr 24 23:54:13.380597 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.380578 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.383257 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.383236 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 23:54:13.383877 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.383819 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 23:54:13.383964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.383890 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.383964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.383904 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.384559 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.384540 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bglmj\"" Apr 24 23:54:13.390759 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.390738 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 23:54:13.394613 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.394592 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"] Apr 24 23:54:13.394758 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.394741 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.396772 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396689 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-tmp\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.396772 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396722 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chg75\" (UniqueName: \"kubernetes.io/projected/f39c93ca-0e47-4091-b7b7-80b2901e8795-kube-api-access-chg75\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.396772 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396750 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.397002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396850 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-snapshots\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.397002 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:54:13.396884 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39c93ca-0e47-4091-b7b7-80b2901e8795-serving-cert\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.397002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396912 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9h2\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396933 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpcm\" (UniqueName: \"kubernetes.io/projected/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-kube-api-access-8qpcm\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.397002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396954 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.396973 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-service-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397009 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397052 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397084 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397115 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets\") pod \"image-registry-5797fc655d-8wv55\" (UID: 
\"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397145 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397175 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397199 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397224 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.397322 
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397275 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 23:54:13.397322 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397300 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:13.397825 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397397 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 23:54:13.397825 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397550 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mfvr4\""
Apr 24 23:54:13.397825 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.397791 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:13.411606 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.411584 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"]
Apr 24 23:54:13.411675 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.411618 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"
Apr 24 23:54:13.415819 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.415348 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-k2q7w\""
Apr 24 23:54:13.416248 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.416226 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:13.416634 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.416612 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:13.417346 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.416645 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 23:54:13.417346 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.416696 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 23:54:13.421959 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.421940 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"]
Apr 24 23:54:13.422039 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.421968 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p"]
Apr 24 23:54:13.422099 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.422085 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:13.424606 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.424589 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z44xn\""
Apr 24 23:54:13.425441 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.425428 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 23:54:13.425533 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.425429 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 23:54:13.438093 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.438077 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tdws9"]
Apr 24 23:54:13.438231 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.438216 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" Apr 24 23:54:13.440610 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.440592 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-x479c\"" Apr 24 23:54:13.440700 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.440637 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:13.440700 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.440687 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:13.452706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452657 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4x6ss"] Apr 24 23:54:13.452706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452678 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cmhdt"] Apr 24 23:54:13.452706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452688 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tdws9"] Apr 24 23:54:13.452706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452696 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm"] Apr 24 23:54:13.452706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452706 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"] Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452714 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"] Apr 24 
23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452721 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx"]
Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452732 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"]
Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452740 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p"]
Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452749 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84bfffbb-vgvrl"]
Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452758 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4ktn"]
Apr 24 23:54:13.452964 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.452737 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:13.455169 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.455150 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 23:54:13.455261 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.455152 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 23:54:13.455261 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.455202 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 23:54:13.455365 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.455202 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mn69n\""
Apr 24 23:54:13.498219 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498194 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9nv\" (UniqueName: \"kubernetes.io/projected/19bb6c72-6566-4c92-b004-41d6f12a658e-kube-api-access-2t9nv\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"
Apr 24 23:54:13.498318 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498234 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:13.498318 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498260 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrbt\" (UniqueName: \"kubernetes.io/projected/c1d5758c-6883-4cf2-be1b-364659aa4379-kube-api-access-6wrbt\") pod \"volume-data-source-validator-7c6cbb6c87-nk5lm\" (UID: \"c1d5758c-6883-4cf2-be1b-364659aa4379\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" Apr 24 23:54:13.498391 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498309 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.498391 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498348 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpcx\" (UniqueName: \"kubernetes.io/projected/f8c356cb-0fee-47ee-a119-26d729d14274-kube-api-access-nnpcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.498391 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498369 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.498391 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498386 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f8c356cb-0fee-47ee-a119-26d729d14274-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498428 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-trusted-ca\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498455 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.498504 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.498520 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.498572 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:13.998551422 +0000 UTC m=+33.640466286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found Apr 24 23:54:13.498607 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498506 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498643 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7t4x\" (UniqueName: \"kubernetes.io/projected/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-kube-api-access-f7t4x\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498683 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498723 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token\") pod 
\"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498731 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498750 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-config\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498785 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498836 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:54:13.498866 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-tmp\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.498944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498913 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chg75\" (UniqueName: \"kubernetes.io/projected/f39c93ca-0e47-4091-b7b7-80b2901e8795-kube-api-access-chg75\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.498960 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499001 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-serving-cert\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499058 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-snapshots\") pod 
\"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.499074 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499098 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39c93ca-0e47-4091-b7b7-80b2901e8795-serving-cert\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.499160 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:13.999142834 +0000 UTC m=+33.641057698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499239 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-tmp\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499269 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19bb6c72-6566-4c92-b004-41d6f12a658e-config\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499296 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5c2t\" (UniqueName: \"kubernetes.io/projected/154d44c3-fd83-4c64-a18a-acbfd5167f6f-kube-api-access-t5c2t\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499325 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-service-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " 
pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.499353 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499346 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c356cb-0fee-47ee-a119-26d729d14274-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.499875 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499365 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.499875 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499381 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx88k\" (UniqueName: \"kubernetes.io/projected/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-kube-api-access-hx88k\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.499875 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499801 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.499875 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499846 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499887 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-stats-auth\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499926 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.499961 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-default-certificate\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500012 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspw7\" (UniqueName: 
\"kubernetes.io/projected/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-kube-api-access-jspw7\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500044 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/154d44c3-fd83-4c64-a18a-acbfd5167f6f-config-volume\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.500072 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500051 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f39c93ca-0e47-4091-b7b7-80b2901e8795-snapshots\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500074 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154d44c3-fd83-4c64-a18a-acbfd5167f6f-tmp-dir\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500096 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-service-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500108 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500148 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpcm\" (UniqueName: \"kubernetes.io/projected/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-kube-api-access-8qpcm\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500196 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9h2\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.500342 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.500237 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19bb6c72-6566-4c92-b004-41d6f12a658e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.501780 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.501637 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39c93ca-0e47-4091-b7b7-80b2901e8795-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: 
\"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.504995 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.503403 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.505680 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.505511 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39c93ca-0e47-4091-b7b7-80b2901e8795-serving-cert\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.506234 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.506211 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.506329 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.506243 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.507264 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.507238 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.508166 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.508145 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.508683 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.508665 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9h2\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:13.508740 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.508683 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpcm\" (UniqueName: \"kubernetes.io/projected/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-kube-api-access-8qpcm\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:13.508815 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.508796 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chg75\" (UniqueName: \"kubernetes.io/projected/f39c93ca-0e47-4091-b7b7-80b2901e8795-kube-api-access-chg75\") pod \"insights-operator-585dfdc468-4x6ss\" (UID: \"f39c93ca-0e47-4091-b7b7-80b2901e8795\") " 
pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.600951 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.600916 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7t4x\" (UniqueName: \"kubernetes.io/projected/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-kube-api-access-f7t4x\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.600951 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.600951 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-config\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.601133 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.600969 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.601133 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.601044 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:54:13.601133 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601042 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 
23:54:13.601133 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.601091 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.101079121 +0000 UTC m=+33.742993972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601133 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-serving-cert\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601192 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19bb6c72-6566-4c92-b004-41d6f12a658e-config\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601213 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5c2t\" (UniqueName: \"kubernetes.io/projected/154d44c3-fd83-4c64-a18a-acbfd5167f6f-kube-api-access-t5c2t\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: 
I0424 23:54:13.601233 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c356cb-0fee-47ee-a119-26d729d14274-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601260 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmmd\" (UniqueName: \"kubernetes.io/projected/10b4da2b-ad8f-4af5-9e0b-28885ad2debc-kube-api-access-xzmmd\") pod \"network-check-source-8894fc9bd-rfr4p\" (UID: \"10b4da2b-ad8f-4af5-9e0b-28885ad2debc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" Apr 24 23:54:13.601325 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601287 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx88k\" (UniqueName: \"kubernetes.io/projected/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-kube-api-access-hx88k\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601433 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601496 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601525 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-stats-auth\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601555 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-default-certificate\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601597 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jspw7\" (UniqueName: \"kubernetes.io/projected/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-kube-api-access-jspw7\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.601623 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.101604032 +0000 UTC m=+33.743518900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt Apr 24 23:54:13.601644 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601630 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-config\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601658 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/154d44c3-fd83-4c64-a18a-acbfd5167f6f-config-volume\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601688 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154d44c3-fd83-4c64-a18a-acbfd5167f6f-tmp-dir\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601717 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601729 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/19bb6c72-6566-4c92-b004-41d6f12a658e-config\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601749 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5s2t\" (UniqueName: \"kubernetes.io/projected/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-kube-api-access-f5s2t\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601781 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19bb6c72-6566-4c92-b004-41d6f12a658e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601811 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9nv\" (UniqueName: \"kubernetes.io/projected/19bb6c72-6566-4c92-b004-41d6f12a658e-kube-api-access-2t9nv\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601855 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrbt\" (UniqueName: \"kubernetes.io/projected/c1d5758c-6883-4cf2-be1b-364659aa4379-kube-api-access-6wrbt\") pod \"volume-data-source-validator-7c6cbb6c87-nk5lm\" (UID: \"c1d5758c-6883-4cf2-be1b-364659aa4379\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.601863 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601887 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c356cb-0fee-47ee-a119-26d729d14274-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601911 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpcx\" (UniqueName: \"kubernetes.io/projected/f8c356cb-0fee-47ee-a119-26d729d14274-kube-api-access-nnpcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.601945 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.101925311 +0000 UTC m=+33.743840169 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:13.601976 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.601980 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.602016 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c356cb-0fee-47ee-a119-26d729d14274-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.602048 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-trusted-ca\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.602075 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " 
pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.602174 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.602194 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154d44c3-fd83-4c64-a18a-acbfd5167f6f-tmp-dir\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.602216 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.102203323 +0000 UTC m=+33.744118179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found Apr 24 23:54:13.602681 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.602225 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/154d44c3-fd83-4c64-a18a-acbfd5167f6f-config-volume\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.603292 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.603268 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-trusted-ca\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.603856 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.603840 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-serving-cert\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.604178 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.604151 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-default-certificate\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.604284 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.604184 2559 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" Apr 24 23:54:13.604284 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.604182 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-stats-auth\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.604284 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.604277 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c356cb-0fee-47ee-a119-26d729d14274-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.607211 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.607194 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19bb6c72-6566-4c92-b004-41d6f12a658e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.610725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.610397 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5c2t\" (UniqueName: \"kubernetes.io/projected/154d44c3-fd83-4c64-a18a-acbfd5167f6f-kube-api-access-t5c2t\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:13.610725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.610478 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7t4x\" 
(UniqueName: \"kubernetes.io/projected/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-kube-api-access-f7t4x\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:13.610996 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.610940 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspw7\" (UniqueName: \"kubernetes.io/projected/85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5-kube-api-access-jspw7\") pod \"console-operator-9d4b6777b-l4ktn\" (UID: \"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5\") " pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.611301 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.611255 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9nv\" (UniqueName: \"kubernetes.io/projected/19bb6c72-6566-4c92-b004-41d6f12a658e-kube-api-access-2t9nv\") pod \"service-ca-operator-d6fc45fc5-hkhsn\" (UID: \"19bb6c72-6566-4c92-b004-41d6f12a658e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" Apr 24 23:54:13.611449 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.611425 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx88k\" (UniqueName: \"kubernetes.io/projected/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-kube-api-access-hx88k\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:13.611548 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.611531 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrbt\" (UniqueName: \"kubernetes.io/projected/c1d5758c-6883-4cf2-be1b-364659aa4379-kube-api-access-6wrbt\") pod \"volume-data-source-validator-7c6cbb6c87-nk5lm\" (UID: \"c1d5758c-6883-4cf2-be1b-364659aa4379\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" Apr 24 23:54:13.612627 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.612609 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpcx\" (UniqueName: \"kubernetes.io/projected/f8c356cb-0fee-47ee-a119-26d729d14274-kube-api-access-nnpcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-b9kkx\" (UID: \"f8c356cb-0fee-47ee-a119-26d729d14274\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.637475 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.637437 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" Apr 24 23:54:13.690981 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.690954 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" Apr 24 23:54:13.703078 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703004 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:13.703078 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703061 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:13.703204 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703100 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xzmmd\" (UniqueName: \"kubernetes.io/projected/10b4da2b-ad8f-4af5-9e0b-28885ad2debc-kube-api-access-xzmmd\") pod \"network-check-source-8894fc9bd-rfr4p\" (UID: \"10b4da2b-ad8f-4af5-9e0b-28885ad2debc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" Apr 24 23:54:13.703204 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703120 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:13.703204 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703170 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5s2t\" (UniqueName: \"kubernetes.io/projected/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-kube-api-access-f5s2t\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:13.703581 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.703553 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:13.703726 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.703621 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.203603827 +0000 UTC m=+33.845518678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found Apr 24 23:54:13.703726 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.703627 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:13.703726 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.703639 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:13.703726 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:13.703722 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:14.203706166 +0000 UTC m=+33.845621017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found Apr 24 23:54:13.705254 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.704933 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" Apr 24 23:54:13.712833 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.712791 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5s2t\" (UniqueName: \"kubernetes.io/projected/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-kube-api-access-f5s2t\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:13.713100 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.713078 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmmd\" (UniqueName: \"kubernetes.io/projected/10b4da2b-ad8f-4af5-9e0b-28885ad2debc-kube-api-access-xzmmd\") pod \"network-check-source-8894fc9bd-rfr4p\" (UID: \"10b4da2b-ad8f-4af5-9e0b-28885ad2debc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" Apr 24 23:54:13.720784 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.720760 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"
Apr 24 23:54:13.739916 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.739799 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4x6ss"]
Apr 24 23:54:13.746733 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:13.745370 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf39c93ca_0e47_4091_b7b7_80b2901e8795.slice/crio-dd89ee8c4ca463d068ac13f3139d0cb97e3a3de9c5faf6a1dda595dce33507e1 WatchSource:0}: Error finding container dd89ee8c4ca463d068ac13f3139d0cb97e3a3de9c5faf6a1dda595dce33507e1: Status 404 returned error can't find the container with id dd89ee8c4ca463d068ac13f3139d0cb97e3a3de9c5faf6a1dda595dce33507e1
Apr 24 23:54:13.746935 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.746794 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p"
Apr 24 23:54:13.918213 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.918182 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm"]
Apr 24 23:54:13.922272 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:13.922249 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1d5758c_6883_4cf2_be1b_364659aa4379.slice/crio-e902c63e58c7ce06e81bd429e5e1b657ebc6a2a7e90b11c8312289067c835ed3 WatchSource:0}: Error finding container e902c63e58c7ce06e81bd429e5e1b657ebc6a2a7e90b11c8312289067c835ed3: Status 404 returned error can't find the container with id e902c63e58c7ce06e81bd429e5e1b657ebc6a2a7e90b11c8312289067c835ed3
Apr 24 23:54:13.929132 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.929107 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx"]
Apr 24 23:54:13.932195 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.932175 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-l4ktn"]
Apr 24 23:54:13.937093 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:13.937059 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85cf1b2a_d5c1_4cb4_8250_bf11078d6bf5.slice/crio-8566e350497ce7d77ca0fe7795cff761c69317778e14aef55ea23fce962eb31a WatchSource:0}: Error finding container 8566e350497ce7d77ca0fe7795cff761c69317778e14aef55ea23fce962eb31a: Status 404 returned error can't find the container with id 8566e350497ce7d77ca0fe7795cff761c69317778e14aef55ea23fce962eb31a
Apr 24 23:54:13.949725 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.949703 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p"]
Apr 24 23:54:13.952600 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:13.952572 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b4da2b_ad8f_4af5_9e0b_28885ad2debc.slice/crio-9936bf92c064f3cbfe638c6876b94bb2302542cb8853d30abc2813a36e947ca3 WatchSource:0}: Error finding container 9936bf92c064f3cbfe638c6876b94bb2302542cb8853d30abc2813a36e947ca3: Status 404 returned error can't find the container with id 9936bf92c064f3cbfe638c6876b94bb2302542cb8853d30abc2813a36e947ca3
Apr 24 23:54:13.956198 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:13.956182 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn"]
Apr 24 23:54:13.958573 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:13.958548 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19bb6c72_6566_4c92_b004_41d6f12a658e.slice/crio-2da5fbedb8db89602a1f969b5b2890903604b4e8ea0cd8367afe15ea15d58349 WatchSource:0}: Error finding container 2da5fbedb8db89602a1f969b5b2890903604b4e8ea0cd8367afe15ea15d58349: Status 404 returned error can't find the container with id 2da5fbedb8db89602a1f969b5b2890903604b4e8ea0cd8367afe15ea15d58349
Apr 24 23:54:14.006101 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.006074 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:14.006229 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.006212 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:14.006289 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.006268 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.00625285 +0000 UTC m=+34.648167702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:14.006345 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.006316 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:14.006443 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.006422 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:14.006514 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.006448 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found
Apr 24 23:54:14.006566 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.006524 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.006506854 +0000 UTC m=+34.648421709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found
Apr 24 23:54:14.101992 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.101955 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" event={"ID":"10b4da2b-ad8f-4af5-9e0b-28885ad2debc","Type":"ContainerStarted","Data":"9936bf92c064f3cbfe638c6876b94bb2302542cb8853d30abc2813a36e947ca3"}
Apr 24 23:54:14.102981 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.102958 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" event={"ID":"19bb6c72-6566-4c92-b004-41d6f12a658e","Type":"ContainerStarted","Data":"2da5fbedb8db89602a1f969b5b2890903604b4e8ea0cd8367afe15ea15d58349"}
Apr 24 23:54:14.103977 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.103955 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" event={"ID":"f8c356cb-0fee-47ee-a119-26d729d14274","Type":"ContainerStarted","Data":"510d165ee8dcbba505cb846a5942f827aeb00e9e8be3b9dc8ddce37b8c4eb248"}
Apr 24 23:54:14.104966 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.104936 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" event={"ID":"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5","Type":"ContainerStarted","Data":"8566e350497ce7d77ca0fe7795cff761c69317778e14aef55ea23fce962eb31a"}
Apr 24 23:54:14.105779 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.105752 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" event={"ID":"c1d5758c-6883-4cf2-be1b-364659aa4379","Type":"ContainerStarted","Data":"e902c63e58c7ce06e81bd429e5e1b657ebc6a2a7e90b11c8312289067c835ed3"}
Apr 24 23:54:14.106758 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.106735 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:14.106856 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.106794 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:14.106910 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.106871 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:14.106964 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.106914 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.106892506 +0000 UTC m=+34.748807360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt
Apr 24 23:54:14.106964 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.106944 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:14.106964 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.106956 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:54:14.107130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.106947 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:14.107130 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.106999 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.106982882 +0000 UTC m=+34.748897736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:14.107130 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.107009 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:54:14.107130 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.107024 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.107013787 +0000 UTC m=+34.748928646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found
Apr 24 23:54:14.107130 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.107044 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.107034124 +0000 UTC m=+34.748948978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found
Apr 24 23:54:14.108489 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.108445 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerStarted","Data":"6286500cd743a65a0df6e1bb5b377149317009cb036b6233789bc27b9492ff23"}
Apr 24 23:54:14.109517 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.109494 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" event={"ID":"f39c93ca-0e47-4091-b7b7-80b2901e8795","Type":"ContainerStarted","Data":"dd89ee8c4ca463d068ac13f3139d0cb97e3a3de9c5faf6a1dda595dce33507e1"}
Apr 24 23:54:14.207961 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.207884 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:14.208094 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.208060 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:14.208156 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.208065 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:14.208220 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.208188 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:14.208280 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.208203 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.208177622 +0000 UTC m=+34.850092487 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found
Apr 24 23:54:14.208343 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.208287 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.208270186 +0000 UTC m=+34.850185038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:14.612924 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.612839 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:14.613156 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.613001 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:14.613156 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.613072 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs podName:f8df9612-54a5-4673-b2cc-33d7768fe61c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.613053149 +0000 UTC m=+66.254968005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs") pod "network-metrics-daemon-2s8sw" (UID: "f8df9612-54a5-4673-b2cc-33d7768fe61c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:14.714336 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.714296 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:14.714638 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.714621 2559 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:14.714722 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:14.714689 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret podName:67b28161-03e9-4905-8e32-8b7353db6c58 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.714671557 +0000 UTC m=+66.356586413 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret") pod "global-pull-secret-syncer-hb4xm" (UID: "67b28161-03e9-4905-8e32-8b7353db6c58") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:14.816293 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.815951 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:14.837219 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.837182 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skk2h\" (UniqueName: \"kubernetes.io/projected/800b9072-49a3-4275-947c-a73644d8448e-kube-api-access-skk2h\") pod \"network-check-target-jz5tk\" (UID: \"800b9072-49a3-4275-947c-a73644d8448e\") " pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:14.934405 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.932606 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:14.934405 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.933571 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:14.934405 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.934136 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:14.936604 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.936002 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8lkgq\""
Apr 24 23:54:14.936604 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.936248 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:14.936604 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.936420 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w9ssg\""
Apr 24 23:54:14.939232 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.936997 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:54:14.952392 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:14.952058 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jz5tk"
Apr 24 23:54:15.018183 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.018150 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:15.018385 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.018254 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:15.018485 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.018449 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:15.018546 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.018530 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.018511587 +0000 UTC m=+36.660426444 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:15.019061 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.018944 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:15.019061 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.018965 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found
Apr 24 23:54:15.019061 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.019014 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.018997409 +0000 UTC m=+36.660912275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.118929 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.119047 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.119083 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.119198 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.119354 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.119335991 +0000 UTC m=+36.761250849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.119772 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.119817 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.119803074 +0000 UTC m=+36.761717927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.120176 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.120213 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.120200603 +0000 UTC m=+36.762115458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.120271 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:54:15.120622 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.120302 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.120292167 +0000 UTC m=+36.762207018 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found
Apr 24 23:54:15.123194 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.122336 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="6286500cd743a65a0df6e1bb5b377149317009cb036b6233789bc27b9492ff23" exitCode=0
Apr 24 23:54:15.123194 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.122410 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"6286500cd743a65a0df6e1bb5b377149317009cb036b6233789bc27b9492ff23"}
Apr 24 23:54:15.128667 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.126779 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jz5tk"]
Apr 24 23:54:15.128892 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:15.128858 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800b9072_49a3_4275_947c_a73644d8448e.slice/crio-eb53f79ae511ac41be23d1d7bf52d48e18dbe2c7284c14076e8fcb6bd2c72a2b WatchSource:0}: Error finding container eb53f79ae511ac41be23d1d7bf52d48e18dbe2c7284c14076e8fcb6bd2c72a2b: Status 404 returned error can't find the container with id eb53f79ae511ac41be23d1d7bf52d48e18dbe2c7284c14076e8fcb6bd2c72a2b
Apr 24 23:54:15.220662 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.220585 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:15.220810 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:15.220787 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:15.222584 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.221193 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:15.222584 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.221254 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.221236626 +0000 UTC m=+36.863151480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:15.222584 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.222134 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:15.222584 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:15.222242 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.222197021 +0000 UTC m=+36.864111876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found
Apr 24 23:54:16.132577 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:16.131596 2559 generic.go:358] "Generic (PLEG): container finished" podID="aa3ab10f-a4b0-49f5-8458-86e3138f3237" containerID="8c32bf114587e689714780dee53ad56fa627073c17012f96d1733a9066d27042" exitCode=0
Apr 24 23:54:16.132577 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:16.131675 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerDied","Data":"8c32bf114587e689714780dee53ad56fa627073c17012f96d1733a9066d27042"}
Apr 24 23:54:16.136552 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:16.136479 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jz5tk" event={"ID":"800b9072-49a3-4275-947c-a73644d8448e","Type":"ContainerStarted","Data":"eb53f79ae511ac41be23d1d7bf52d48e18dbe2c7284c14076e8fcb6bd2c72a2b"}
Apr 24 23:54:17.039559 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.039532 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:17.039683 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.039615 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:17.039683 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.039646 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:17.039683 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.039663 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found
Apr 24 23:54:17.039975 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.039720 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.03970689 +0000 UTC m=+40.681621740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found
Apr 24 23:54:17.039975 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.039765 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:17.039975 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.039822 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.039809939 +0000 UTC m=+40.681724792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:17.140633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.140604 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:17.141057 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.140662 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:17.141057 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.140734 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:17.141057 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.140772 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID:
\"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:17.141057 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.140923 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:54:17.141057 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.140981 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.140964404 +0000 UTC m=+40.782879260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found Apr 24 23:54:17.141294 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.141233 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.141212643 +0000 UTC m=+40.783127509 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt Apr 24 23:54:17.141294 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.141266 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:54:17.141388 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.141324 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.141307091 +0000 UTC m=+40.783221948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found Apr 24 23:54:17.141388 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.141274 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:17.141388 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.141376 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.141367089 +0000 UTC m=+40.783281940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:17.241846 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.241796 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:17.242011 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.241964 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:17.242082 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.242050 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.242029011 +0000 UTC m=+40.883943877 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found Apr 24 23:54:17.242143 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:17.242130 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:17.242289 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.242270 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:17.242346 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:17.242340 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.242325849 +0000 UTC m=+40.884240707 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found Apr 24 23:54:21.076944 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.076908 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.076997 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.077066 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.077088 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.077112 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.077145 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.077126046 +0000 UTC m=+48.719040917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found Apr 24 23:54:21.077320 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.077165 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.077155016 +0000 UTC m=+48.719069868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:54:21.149332 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.149262 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" event={"ID":"10b4da2b-ad8f-4af5-9e0b-28885ad2debc","Type":"ContainerStarted","Data":"854abdef193e953e2c73fa649dcc7d29e8929e453a686d10626e8943a82de771"} Apr 24 23:54:21.150759 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.150734 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" event={"ID":"19bb6c72-6566-4c92-b004-41d6f12a658e","Type":"ContainerStarted","Data":"9f3a96694e5d16c3ab12da7ef2b23ac6f725ca7ea48540f8fcb480140d86b56a"} Apr 24 
23:54:21.152637 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.152227 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" event={"ID":"f8c356cb-0fee-47ee-a119-26d729d14274","Type":"ContainerStarted","Data":"bd2250133a8451ea8a9d1a481f2f04c9e90ffbb03fdf3b1a37b0963c89f544f5"} Apr 24 23:54:21.154085 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.154066 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/0.log" Apr 24 23:54:21.154173 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.154104 2559 generic.go:358] "Generic (PLEG): container finished" podID="85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5" containerID="e2962a0d592ac50016caeaac0dd02dbc7426501c9b76e9ad3bb53189b3adab21" exitCode=255 Apr 24 23:54:21.154246 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.154170 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" event={"ID":"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5","Type":"ContainerDied","Data":"e2962a0d592ac50016caeaac0dd02dbc7426501c9b76e9ad3bb53189b3adab21"} Apr 24 23:54:21.154391 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.154370 2559 scope.go:117] "RemoveContainer" containerID="e2962a0d592ac50016caeaac0dd02dbc7426501c9b76e9ad3bb53189b3adab21" Apr 24 23:54:21.155612 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.155572 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" event={"ID":"c1d5758c-6883-4cf2-be1b-364659aa4379","Type":"ContainerStarted","Data":"049d9500d583dc1a1cbde8f2f31b2e22d52a02ecbe8898b63a8a6cd5c45e6c62"} Apr 24 23:54:21.158843 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.158810 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-shvmr" event={"ID":"aa3ab10f-a4b0-49f5-8458-86e3138f3237","Type":"ContainerStarted","Data":"2ac4a652f47f6e86b1c445cfe5652365138ddfb6e60b3e70710b8afdf5de99c6"} Apr 24 23:54:21.160135 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.160095 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" event={"ID":"f39c93ca-0e47-4091-b7b7-80b2901e8795","Type":"ContainerStarted","Data":"aa92fb55788f3bdd53b8d6712c7e9b8df712dffbb7fa9fd0d99ec8b0a0ee8406"} Apr 24 23:54:21.161540 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.161515 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jz5tk" event={"ID":"800b9072-49a3-4275-947c-a73644d8448e","Type":"ContainerStarted","Data":"6ed342efc197d508f4a273aa852998c38723b8eecc42db60bd87d3aa12051e00"} Apr 24 23:54:21.161702 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.161690 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:54:21.169225 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.169183 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rfr4p" podStartSLOduration=32.328287054 podStartE2EDuration="39.169168119s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.954309679 +0000 UTC m=+33.596224531" lastFinishedPulling="2026-04-24 23:54:20.795190741 +0000 UTC m=+40.437105596" observedRunningTime="2026-04-24 23:54:21.16743327 +0000 UTC m=+40.809348144" watchObservedRunningTime="2026-04-24 23:54:21.169168119 +0000 UTC m=+40.811082998" Apr 24 23:54:21.177580 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.177556 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" Apr 24 23:54:21.177840 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.177815 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:21.177924 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.177908 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:21.179322 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.178126 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 23:54:21.179493 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.178499 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.178478247 +0000 UTC m=+48.820393117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt Apr 24 23:54:21.179599 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.179146 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:21.179736 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.179659 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.179639883 +0000 UTC m=+48.821554746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found Apr 24 23:54:21.179824 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.179767 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.17975071 +0000 UTC m=+48.821665565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found Apr 24 23:54:21.179962 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.179933 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl" Apr 24 23:54:21.180980 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.180960 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:54:21.181058 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.181009 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.180994084 +0000 UTC m=+48.822908940 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found Apr 24 23:54:21.209195 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.209156 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jz5tk" podStartSLOduration=34.54333207 podStartE2EDuration="40.209144679s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:54:15.137554791 +0000 UTC m=+34.779469643" lastFinishedPulling="2026-04-24 23:54:20.803367402 +0000 UTC m=+40.445282252" observedRunningTime="2026-04-24 23:54:21.189128983 +0000 UTC m=+40.831043872" watchObservedRunningTime="2026-04-24 23:54:21.209144679 +0000 UTC m=+40.851059552" Apr 24 23:54:21.209847 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.209814 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk5lm" podStartSLOduration=33.34460961 podStartE2EDuration="40.209807039s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.924119959 +0000 UTC m=+33.566034810" lastFinishedPulling="2026-04-24 23:54:20.789317373 +0000 UTC m=+40.431232239" observedRunningTime="2026-04-24 23:54:21.208446017 +0000 UTC m=+40.850360890" watchObservedRunningTime="2026-04-24 23:54:21.209807039 +0000 UTC m=+40.851721912" Apr 24 23:54:21.232936 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.232903 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-shvmr" podStartSLOduration=9.948348224 podStartE2EDuration="40.23289172s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:53:43.450050603 +0000 
UTC m=+3.091965470" lastFinishedPulling="2026-04-24 23:54:13.734594112 +0000 UTC m=+33.376508966" observedRunningTime="2026-04-24 23:54:21.232335865 +0000 UTC m=+40.874250738" watchObservedRunningTime="2026-04-24 23:54:21.23289172 +0000 UTC m=+40.874806593" Apr 24 23:54:21.280495 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.280417 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" Apr 24 23:54:21.280613 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.280589 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9" Apr 24 23:54:21.280712 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.280697 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:21.280777 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.280747 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.280731948 +0000 UTC m=+48.922646802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found Apr 24 23:54:21.280836 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.280804 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:21.280836 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:21.280835 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:29.280825229 +0000 UTC m=+48.922740084 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found Apr 24 23:54:21.281031 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.280987 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" podStartSLOduration=32.423953626 podStartE2EDuration="39.280974343s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.93816814 +0000 UTC m=+33.580083006" lastFinishedPulling="2026-04-24 23:54:20.795188865 +0000 UTC m=+40.437103723" observedRunningTime="2026-04-24 23:54:21.279421519 +0000 UTC m=+40.921336391" watchObservedRunningTime="2026-04-24 23:54:21.280974343 +0000 UTC m=+40.922889227" Apr 24 23:54:21.300158 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.299252 2559 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" podStartSLOduration=32.465306544 podStartE2EDuration="39.299236676s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.960206093 +0000 UTC m=+33.602120948" lastFinishedPulling="2026-04-24 23:54:20.794136216 +0000 UTC m=+40.436051080" observedRunningTime="2026-04-24 23:54:21.298692826 +0000 UTC m=+40.940607701" watchObservedRunningTime="2026-04-24 23:54:21.299236676 +0000 UTC m=+40.941151551" Apr 24 23:54:21.351542 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:21.351488 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" podStartSLOduration=33.30920218 podStartE2EDuration="40.351453972s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.749341088 +0000 UTC m=+33.391255952" lastFinishedPulling="2026-04-24 23:54:20.791592889 +0000 UTC m=+40.433507744" observedRunningTime="2026-04-24 23:54:21.34990546 +0000 UTC m=+40.991820333" watchObservedRunningTime="2026-04-24 23:54:21.351453972 +0000 UTC m=+40.993368847" Apr 24 23:54:22.166623 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.166595 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 24 23:54:22.167014 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.166948 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/0.log" Apr 24 23:54:22.167014 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.166979 2559 generic.go:358] "Generic (PLEG): container finished" podID="85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5" 
containerID="335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03" exitCode=255
Apr 24 23:54:22.167158 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.167121 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" event={"ID":"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5","Type":"ContainerDied","Data":"335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03"}
Apr 24 23:54:22.167292 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.167191 2559 scope.go:117] "RemoveContainer" containerID="e2962a0d592ac50016caeaac0dd02dbc7426501c9b76e9ad3bb53189b3adab21"
Apr 24 23:54:22.167363 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:22.167344 2559 scope.go:117] "RemoveContainer" containerID="335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03"
Apr 24 23:54:22.167717 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:22.167638 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4ktn_openshift-console-operator(85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" podUID="85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5"
Apr 24 23:54:23.171912 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:23.171886 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 24 23:54:23.172259 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:23.172218 2559 scope.go:117] "RemoveContainer" containerID="335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03"
Apr 24 23:54:23.172383 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:23.172366 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4ktn_openshift-console-operator(85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" podUID="85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5"
Apr 24 23:54:23.691493 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:23.691449 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn"
Apr 24 23:54:23.691648 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:23.691508 2559 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn"
Apr 24 23:54:23.783841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:23.783816 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wkhjm_3189ee75-b94d-4dc4-a4f0-5805c80f852c/dns-node-resolver/0.log"
Apr 24 23:54:24.174776 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:24.174751 2559 scope.go:117] "RemoveContainer" containerID="335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03"
Apr 24 23:54:24.175105 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:24.174923 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-l4ktn_openshift-console-operator(85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" podUID="85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5"
Apr 24 23:54:24.381014 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:24.380992 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4glfj_991de149-fd35-4947-8e6c-35dfa11c084c/node-ca/0.log"
Apr 24 23:54:29.147891 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.147856 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.147930 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.148016 2559 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.148035 2559 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5797fc655d-8wv55: secret "image-registry-tls" not found
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.148036 2559 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.148100 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls podName:707e7bff-7937-422a-9cfa-268de45b3dd6 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.148084761 +0000 UTC m=+64.789999612 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls") pod "image-registry-5797fc655d-8wv55" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6") : secret "image-registry-tls" not found
Apr 24 23:54:29.148318 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.148114 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls podName:f3ec85ed-0f08-4e69-b8f5-19f031ceea01 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.14810838 +0000 UTC m=+64.790023231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lfc8g" (UID: "f3ec85ed-0f08-4e69-b8f5-19f031ceea01") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:54:29.249216 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.249187 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:29.249346 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.249234 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:29.249381 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.249350 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:29.249427 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249374 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.249357593 +0000 UTC m=+64.891272444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : configmap references non-existent config key: service-ca.crt
Apr 24 23:54:29.249427 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249413 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:54:29.249542 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.249426 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:29.249542 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249448 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs podName:a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.249437452 +0000 UTC m=+64.891352303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs") pod "router-default-84bfffbb-vgvrl" (UID: "a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c") : secret "router-metrics-certs-default" not found
Apr 24 23:54:29.249542 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249455 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:29.249542 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249508 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 23:54:29.249542 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249514 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls podName:154d44c3-fd83-4c64-a18a-acbfd5167f6f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.249503196 +0000 UTC m=+64.891418052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls") pod "dns-default-cmhdt" (UID: "154d44c3-fd83-4c64-a18a-acbfd5167f6f") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:29.249722 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.249550 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls podName:f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.249538746 +0000 UTC m=+64.891453601 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tq8pm" (UID: "f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1") : secret "samples-operator-tls" not found
Apr 24 23:54:29.350794 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.350762 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:29.350932 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:29.350818 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:29.350932 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.350890 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:29.351013 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.350942 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert podName:a805551d-fa54-4c4d-a5d2-b5057e7eb7a9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.350930245 +0000 UTC m=+64.992845095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert") pod "ingress-canary-tdws9" (UID: "a805551d-fa54-4c4d-a5d2-b5057e7eb7a9") : secret "canary-serving-cert" not found
Apr 24 23:54:29.351013 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.350956 2559 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:29.351013 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:54:29.350994 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert podName:8bd67c57-900f-4e8c-bc50-b1e0a7960a53 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.350984504 +0000 UTC m=+64.992899360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pwflc" (UID: "8bd67c57-900f-4e8c-bc50-b1e0a7960a53") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:38.098356 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:38.098327 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfwmd"
Apr 24 23:54:38.932175 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:38.932146 2559 scope.go:117] "RemoveContainer" containerID="335a64101cc3c4cf18a8d02ad722b6a64924a206db78516498e35d8aa8e2de03"
Apr 24 23:54:39.213714 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:39.213636 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 24 23:54:39.214104 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:39.213731 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" event={"ID":"85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5","Type":"ContainerStarted","Data":"9121952cc895628d627e62a83c61b0045c7248c77a77311a063187d45a1f4fc0"}
Apr 24 23:54:39.214104 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:39.213998 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn"
Apr 24 23:54:39.540438 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:39.540370 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn"
Apr 24 23:54:39.559926 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:39.559873 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-l4ktn" podStartSLOduration=50.704805905 podStartE2EDuration="57.559858506s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:13.939509762 +0000 UTC m=+33.581424612" lastFinishedPulling="2026-04-24 23:54:20.794562361 +0000 UTC m=+40.436477213" observedRunningTime="2026-04-24 23:54:39.232558849 +0000 UTC m=+58.874473747" watchObservedRunningTime="2026-04-24 23:54:39.559858506 +0000 UTC m=+59.201773378"
Apr 24 23:54:45.191799 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.191765 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:45.192149 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.191872 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:45.194308 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.194274 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3ec85ed-0f08-4e69-b8f5-19f031ceea01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lfc8g\" (UID: \"f3ec85ed-0f08-4e69-b8f5-19f031ceea01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:45.194418 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.194339 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"image-registry-5797fc655d-8wv55\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:45.292520 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.292486 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:45.292712 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.292535 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:45.292712 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.292575 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:45.292712 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.292595 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:45.293144 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.293113 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-service-ca-bundle\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:45.294903 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.294868 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/154d44c3-fd83-4c64-a18a-acbfd5167f6f-metrics-tls\") pod \"dns-default-cmhdt\" (UID: \"154d44c3-fd83-4c64-a18a-acbfd5167f6f\") " pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:45.295049 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.295029 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tq8pm\" (UID: \"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:45.295105 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.295058 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c-metrics-certs\") pod \"router-default-84bfffbb-vgvrl\" (UID: \"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c\") " pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:45.379955 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.379924 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68gn9\""
Apr 24 23:54:45.388167 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.388147 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:45.393098 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.393072 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:45.393175 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.393140 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:45.395349 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.395326 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a805551d-fa54-4c4d-a5d2-b5057e7eb7a9-cert\") pod \"ingress-canary-tdws9\" (UID: \"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9\") " pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:45.395426 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.395366 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8bd67c57-900f-4e8c-bc50-b1e0a7960a53-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pwflc\" (UID: \"8bd67c57-900f-4e8c-bc50-b1e0a7960a53\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:45.428109 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.428082 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dcbd2\""
Apr 24 23:54:45.435505 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.435479 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"
Apr 24 23:54:45.456454 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.456219 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hxxq5\""
Apr 24 23:54:45.463594 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.463572 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:45.466336 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.466303 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xmfk4\""
Apr 24 23:54:45.474802 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.474718 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cmhdt"
Apr 24 23:54:45.483206 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.483151 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-w5gns\""
Apr 24 23:54:45.492432 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.490923 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"
Apr 24 23:54:45.536684 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.534328 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"]
Apr 24 23:54:45.537437 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:45.536883 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707e7bff_7937_422a_9cfa_268de45b3dd6.slice/crio-a0b348f0c7045535390d384493365afceeb0e8552206b9abfb4688650bd927f8 WatchSource:0}: Error finding container a0b348f0c7045535390d384493365afceeb0e8552206b9abfb4688650bd927f8: Status 404 returned error can't find the container with id a0b348f0c7045535390d384493365afceeb0e8552206b9abfb4688650bd927f8
Apr 24 23:54:45.537437 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.537167 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z44xn\""
Apr 24 23:54:45.539566 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.539493 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"
Apr 24 23:54:45.563682 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.563445 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mn69n\""
Apr 24 23:54:45.572802 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.571403 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tdws9"
Apr 24 23:54:45.607669 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.607425 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g"]
Apr 24 23:54:45.646405 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.646349 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84bfffbb-vgvrl"]
Apr 24 23:54:45.659109 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.659057 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cmhdt"]
Apr 24 23:54:45.663118 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:45.663077 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95f5cf2_cd4a_474e_b34e_1b3165d0eb8c.slice/crio-f4760534f6a4a43ff4f80b4542fa89164f3b1c19947d5f6b6db3864347fcaf26 WatchSource:0}: Error finding container f4760534f6a4a43ff4f80b4542fa89164f3b1c19947d5f6b6db3864347fcaf26: Status 404 returned error can't find the container with id f4760534f6a4a43ff4f80b4542fa89164f3b1c19947d5f6b6db3864347fcaf26
Apr 24 23:54:45.665031 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:45.665002 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod154d44c3_fd83_4c64_a18a_acbfd5167f6f.slice/crio-d086e45a8bba89047984e9d3d6d06cad8215b63a95fb679dd9b6c34fa5a74f4a WatchSource:0}: Error finding container d086e45a8bba89047984e9d3d6d06cad8215b63a95fb679dd9b6c34fa5a74f4a: Status 404 returned error can't find the container with id d086e45a8bba89047984e9d3d6d06cad8215b63a95fb679dd9b6c34fa5a74f4a
Apr 24 23:54:45.682091 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.682036 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm"]
Apr 24 23:54:45.744130 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.744103 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pwflc"]
Apr 24 23:54:45.747427 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:45.747404 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd67c57_900f_4e8c_bc50_b1e0a7960a53.slice/crio-d789233b0e45e14bd4918e4b8640e058d067feb9a7b6c86460c082626e96d5df WatchSource:0}: Error finding container d789233b0e45e14bd4918e4b8640e058d067feb9a7b6c86460c082626e96d5df: Status 404 returned error can't find the container with id d789233b0e45e14bd4918e4b8640e058d067feb9a7b6c86460c082626e96d5df
Apr 24 23:54:45.779482 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:45.779444 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tdws9"]
Apr 24 23:54:45.790214 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:45.790186 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda805551d_fa54_4c4d_a5d2_b5057e7eb7a9.slice/crio-93f58914b67492743bfe79938c927f94b4f88578f75acb1826317e93f1926849 WatchSource:0}: Error finding container 93f58914b67492743bfe79938c927f94b4f88578f75acb1826317e93f1926849: Status 404 returned error can't find the container with id 93f58914b67492743bfe79938c927f94b4f88578f75acb1826317e93f1926849
Apr 24 23:54:46.237067 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.236990 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" event={"ID":"707e7bff-7937-422a-9cfa-268de45b3dd6","Type":"ContainerStarted","Data":"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"}
Apr 24 23:54:46.237067 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.237039 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" event={"ID":"707e7bff-7937-422a-9cfa-268de45b3dd6","Type":"ContainerStarted","Data":"a0b348f0c7045535390d384493365afceeb0e8552206b9abfb4688650bd927f8"}
Apr 24 23:54:46.237593 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.237275 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:54:46.238879 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.238850 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" event={"ID":"8bd67c57-900f-4e8c-bc50-b1e0a7960a53","Type":"ContainerStarted","Data":"d789233b0e45e14bd4918e4b8640e058d067feb9a7b6c86460c082626e96d5df"}
Apr 24 23:54:46.240409 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.240381 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" event={"ID":"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1","Type":"ContainerStarted","Data":"61a944731b4a58849e3e38b3a1094d0ec1fde0eb60de7f57aa27ac6fece134ca"}
Apr 24 23:54:46.242609 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.242578 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmhdt" event={"ID":"154d44c3-fd83-4c64-a18a-acbfd5167f6f","Type":"ContainerStarted","Data":"d086e45a8bba89047984e9d3d6d06cad8215b63a95fb679dd9b6c34fa5a74f4a"}
Apr 24 23:54:46.244316 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.244289 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tdws9" event={"ID":"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9","Type":"ContainerStarted","Data":"93f58914b67492743bfe79938c927f94b4f88578f75acb1826317e93f1926849"}
Apr 24 23:54:46.245745 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.245722 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" event={"ID":"f3ec85ed-0f08-4e69-b8f5-19f031ceea01","Type":"ContainerStarted","Data":"252c687076e9079316cca8f869d7f80e81d378f054bc03e904ab5f36ac3edbf0"}
Apr 24 23:54:46.247857 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.247779 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84bfffbb-vgvrl" event={"ID":"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c","Type":"ContainerStarted","Data":"16986709e73a5040bb34b1e0b541d76609c81f5dde82eab3dd0183374ca38841"}
Apr 24 23:54:46.247857 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.247810 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84bfffbb-vgvrl" event={"ID":"a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c","Type":"ContainerStarted","Data":"f4760534f6a4a43ff4f80b4542fa89164f3b1c19947d5f6b6db3864347fcaf26"}
Apr 24 23:54:46.260160 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.259913 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" podStartSLOduration=65.259897645 podStartE2EDuration="1m5.259897645s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:46.258679481 +0000 UTC m=+65.900594356" watchObservedRunningTime="2026-04-24 23:54:46.259897645 +0000 UTC m=+65.901812521"
Apr 24 23:54:46.278851 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.278799 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-84bfffbb-vgvrl" podStartSLOduration=65.278781913 podStartE2EDuration="1m5.278781913s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:46.277481465 +0000 UTC m=+65.919396333" watchObservedRunningTime="2026-04-24 23:54:46.278781913 +0000 UTC m=+65.920696787"
Apr 24 23:54:46.464830 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.464595 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:46.468101 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.467880 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:46.710630 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.710170 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:46.713048 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.712999 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:46.726568 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.726495 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8df9612-54a5-4673-b2cc-33d7768fe61c-metrics-certs\") pod \"network-metrics-daemon-2s8sw\" (UID: \"f8df9612-54a5-4673-b2cc-33d7768fe61c\") " pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:46.779619 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.779391 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w9ssg\""
Apr 24 23:54:46.787264 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.787241 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2s8sw"
Apr 24 23:54:46.811222 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.810939 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:46.814065 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.813877 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:54:46.826888 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:46.826862 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67b28161-03e9-4905-8e32-8b7353db6c58-original-pull-secret\") pod \"global-pull-secret-syncer-hb4xm\" (UID: \"67b28161-03e9-4905-8e32-8b7353db6c58\") " pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:47.088577 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:47.088396 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hb4xm"
Apr 24 23:54:47.250949 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:47.250897 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:47.252622 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:47.252600 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-84bfffbb-vgvrl"
Apr 24 23:54:49.922591 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.922563 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm"]
Apr 24 23:54:49.940288 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.940267 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm"
Apr 24 23:54:49.944720 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.944702 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 23:54:49.944826 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.944807 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 23:54:49.944994 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.944975 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 23:54:49.945743 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.945726 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 23:54:49.945849 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.945833 2559 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 23:54:49.945892 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.945840 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 23:54:49.946107 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.946090 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 23:54:49.952018 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:49.951999 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm"] Apr 24 23:54:50.037711 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037678 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.037850 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037726 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.037850 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037792 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" 
(UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.037939 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037871 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.037939 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037912 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/944e2ba1-b748-4770-ac3c-b746589e18e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.037939 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.037932 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hc6m\" (UniqueName: \"kubernetes.io/projected/944e2ba1-b748-4770-ac3c-b746589e18e5-kube-api-access-8hc6m\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.109418 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.109374 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kmmfk"] Apr 24 23:54:50.128374 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.128347 2559 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmmfk"] Apr 24 23:54:50.128552 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.128536 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.132923 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.132894 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:54:50.133028 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.132925 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:54:50.133028 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.132948 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gprwq\"" Apr 24 23:54:50.138563 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138543 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.138668 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138573 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.138668 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138606 2559 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.138668 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138645 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.138805 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138669 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/944e2ba1-b748-4770-ac3c-b746589e18e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.138805 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.138691 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hc6m\" (UniqueName: \"kubernetes.io/projected/944e2ba1-b748-4770-ac3c-b746589e18e5-kube-api-access-8hc6m\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.139557 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.139530 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/944e2ba1-b748-4770-ac3c-b746589e18e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: 
\"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.141190 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.141146 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.141190 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.141146 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-ca\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.141337 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.141301 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.152393 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.152376 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/944e2ba1-b748-4770-ac3c-b746589e18e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.166679 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.166659 2559 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8hc6m\" (UniqueName: \"kubernetes.io/projected/944e2ba1-b748-4770-ac3c-b746589e18e5-kube-api-access-8hc6m\") pod \"cluster-proxy-proxy-agent-6cd7fcbd-gw9rm\" (UID: \"944e2ba1-b748-4770-ac3c-b746589e18e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.239895 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.239873 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ng4\" (UniqueName: \"kubernetes.io/projected/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-api-access-x6ng4\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.239985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.239907 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.239985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.239940 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6af5e39d-3c39-4d8c-b886-74bd88370c79-crio-socket\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.239985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.239969 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6af5e39d-3c39-4d8c-b886-74bd88370c79-data-volume\") pod 
\"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.240107 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.239994 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6af5e39d-3c39-4d8c-b886-74bd88370c79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.261089 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.261064 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" Apr 24 23:54:50.341495 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.341432 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ng4\" (UniqueName: \"kubernetes.io/projected/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-api-access-x6ng4\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.341633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.341497 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.341633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.341550 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6af5e39d-3c39-4d8c-b886-74bd88370c79-crio-socket\") pod 
\"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.341633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.341586 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6af5e39d-3c39-4d8c-b886-74bd88370c79-data-volume\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.341633 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.341618 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6af5e39d-3c39-4d8c-b886-74bd88370c79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.342973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.342838 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6af5e39d-3c39-4d8c-b886-74bd88370c79-crio-socket\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.344981 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.343218 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6af5e39d-3c39-4d8c-b886-74bd88370c79-data-volume\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.349018 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.346347 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6af5e39d-3c39-4d8c-b886-74bd88370c79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.349018 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.346680 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.360610 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.359201 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ng4\" (UniqueName: \"kubernetes.io/projected/6af5e39d-3c39-4d8c-b886-74bd88370c79-kube-api-access-x6ng4\") pod \"insights-runtime-extractor-kmmfk\" (UID: \"6af5e39d-3c39-4d8c-b886-74bd88370c79\") " pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.437982 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.437617 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmmfk" Apr 24 23:54:50.467511 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.465372 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hb4xm"] Apr 24 23:54:50.509419 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.491011 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2s8sw"] Apr 24 23:54:50.518747 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.518699 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm"] Apr 24 23:54:50.521712 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:50.521677 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8df9612_54a5_4673_b2cc_33d7768fe61c.slice/crio-ce32fcc6bb2cec337ffac9102a3e4d6d1cfed1bc519905347c6d856063fd9b5c WatchSource:0}: Error finding container ce32fcc6bb2cec337ffac9102a3e4d6d1cfed1bc519905347c6d856063fd9b5c: Status 404 returned error can't find the container with id ce32fcc6bb2cec337ffac9102a3e4d6d1cfed1bc519905347c6d856063fd9b5c Apr 24 23:54:50.652934 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:50.652904 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmmfk"] Apr 24 23:54:50.658174 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:54:50.657938 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6af5e39d_3c39_4d8c_b886_74bd88370c79.slice/crio-20b0f68349b59cf5b4cac60344ff2aabc1ff8b408b5a24840d1be5ad1e89297e WatchSource:0}: Error finding container 20b0f68349b59cf5b4cac60344ff2aabc1ff8b408b5a24840d1be5ad1e89297e: Status 404 returned error can't find the container with id 20b0f68349b59cf5b4cac60344ff2aabc1ff8b408b5a24840d1be5ad1e89297e Apr 24 
23:54:51.267490 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.267413 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2s8sw" event={"ID":"f8df9612-54a5-4673-b2cc-33d7768fe61c","Type":"ContainerStarted","Data":"ce32fcc6bb2cec337ffac9102a3e4d6d1cfed1bc519905347c6d856063fd9b5c"} Apr 24 23:54:51.268913 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.268854 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hb4xm" event={"ID":"67b28161-03e9-4905-8e32-8b7353db6c58","Type":"ContainerStarted","Data":"e232652cca1b040d9aa5c80de3e7c060ce2d7dbfdf44af89748abe7acc8b2c27"} Apr 24 23:54:51.270805 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.270782 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" event={"ID":"8bd67c57-900f-4e8c-bc50-b1e0a7960a53","Type":"ContainerStarted","Data":"32ff0e1e51f482a66b6610a2a35ffc16ca212e6fed7e4be7591c0d27db0103da"} Apr 24 23:54:51.273999 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.273578 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" event={"ID":"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1","Type":"ContainerStarted","Data":"6049793070f79bb3ba6796d66ad1db4913fb1a08faed9d8bd87a8e7d317c9630"} Apr 24 23:54:51.273999 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.273607 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" event={"ID":"f0ea1221-a323-4fa1-8f2b-9b4b3e3faff1","Type":"ContainerStarted","Data":"abac083878ef06d797fcce7caa2a2816a84feaae71760da4ce306a05602e32c8"} Apr 24 23:54:51.275700 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.275453 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tdws9" 
event={"ID":"a805551d-fa54-4c4d-a5d2-b5057e7eb7a9","Type":"ContainerStarted","Data":"141038a9a7de4f456225df0808271220997ff3d00f4a22149860c929945dcc74"} Apr 24 23:54:51.279305 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.279131 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" event={"ID":"f3ec85ed-0f08-4e69-b8f5-19f031ceea01","Type":"ContainerStarted","Data":"303931edb9169036b3cb7bf32bdc0c9bfc0cc64a019f4199cdc0f65138236656"} Apr 24 23:54:51.281271 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.281193 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmmfk" event={"ID":"6af5e39d-3c39-4d8c-b886-74bd88370c79","Type":"ContainerStarted","Data":"5fdbde23bf642cbaa8e4fa8cc89228e78b7f75c716fc7718dcde4372680ae3a7"} Apr 24 23:54:51.281271 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.281224 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmmfk" event={"ID":"6af5e39d-3c39-4d8c-b886-74bd88370c79","Type":"ContainerStarted","Data":"20b0f68349b59cf5b4cac60344ff2aabc1ff8b408b5a24840d1be5ad1e89297e"} Apr 24 23:54:51.283518 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.283449 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" event={"ID":"944e2ba1-b748-4770-ac3c-b746589e18e5","Type":"ContainerStarted","Data":"b81de65ac04edeab2b22f5f35c5320a830fcd5daf83d9add0337666e963bbe40"} Apr 24 23:54:51.288002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.287931 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmhdt" event={"ID":"154d44c3-fd83-4c64-a18a-acbfd5167f6f","Type":"ContainerStarted","Data":"983afe7de34adce0163baacb89e6d195aacf3463f125f88dd00de858556e3a08"} Apr 24 23:54:51.288002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.287963 2559 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmhdt" event={"ID":"154d44c3-fd83-4c64-a18a-acbfd5167f6f","Type":"ContainerStarted","Data":"43a3836cec5b5420f9d6b4562a1f41863bc259429d4988e1016cca0ad7d93eda"} Apr 24 23:54:51.288002 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.287981 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cmhdt" Apr 24 23:54:51.312835 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.312556 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pwflc" podStartSLOduration=45.765609269 podStartE2EDuration="50.312535447s" podCreationTimestamp="2026-04-24 23:54:01 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.749500457 +0000 UTC m=+65.391415310" lastFinishedPulling="2026-04-24 23:54:50.296426636 +0000 UTC m=+69.938341488" observedRunningTime="2026-04-24 23:54:51.299682232 +0000 UTC m=+70.941597106" watchObservedRunningTime="2026-04-24 23:54:51.312535447 +0000 UTC m=+70.954450321" Apr 24 23:54:51.341037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.340858 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lfc8g" podStartSLOduration=65.660717828 podStartE2EDuration="1m10.340839261s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.615867561 +0000 UTC m=+65.257782425" lastFinishedPulling="2026-04-24 23:54:50.295989005 +0000 UTC m=+69.937903858" observedRunningTime="2026-04-24 23:54:51.338060405 +0000 UTC m=+70.979975282" watchObservedRunningTime="2026-04-24 23:54:51.340839261 +0000 UTC m=+70.982754137" Apr 24 23:54:51.367865 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.367482 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tq8pm" 
podStartSLOduration=64.8295176 podStartE2EDuration="1m9.367443838s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.76526418 +0000 UTC m=+65.407179030" lastFinishedPulling="2026-04-24 23:54:50.303190412 +0000 UTC m=+69.945105268" observedRunningTime="2026-04-24 23:54:51.366216897 +0000 UTC m=+71.008131771" watchObservedRunningTime="2026-04-24 23:54:51.367443838 +0000 UTC m=+71.009358714" Apr 24 23:54:51.429817 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.428675 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cmhdt" podStartSLOduration=33.801630949 podStartE2EDuration="38.428654905s" podCreationTimestamp="2026-04-24 23:54:13 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.667497572 +0000 UTC m=+65.309412438" lastFinishedPulling="2026-04-24 23:54:50.294521543 +0000 UTC m=+69.936436394" observedRunningTime="2026-04-24 23:54:51.39667502 +0000 UTC m=+71.038589895" watchObservedRunningTime="2026-04-24 23:54:51.428654905 +0000 UTC m=+71.070569779" Apr 24 23:54:51.429817 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:51.428957 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tdws9" podStartSLOduration=33.921649014 podStartE2EDuration="38.428953387s" podCreationTimestamp="2026-04-24 23:54:13 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.792293047 +0000 UTC m=+65.434207898" lastFinishedPulling="2026-04-24 23:54:50.299597419 +0000 UTC m=+69.941512271" observedRunningTime="2026-04-24 23:54:51.428605032 +0000 UTC m=+71.070519905" watchObservedRunningTime="2026-04-24 23:54:51.428953387 +0000 UTC m=+71.070868295" Apr 24 23:54:52.169966 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:52.169936 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jz5tk" Apr 24 23:54:55.302985 ip-10-0-129-4 kubenswrapper[2559]: I0424 
23:54:55.302830 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" event={"ID":"944e2ba1-b748-4770-ac3c-b746589e18e5","Type":"ContainerStarted","Data":"1190268e9cded3a2da480017a978a3dac5ce351f7918e63eee4c6b93c695a587"} Apr 24 23:54:55.304641 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:55.304615 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmmfk" event={"ID":"6af5e39d-3c39-4d8c-b886-74bd88370c79","Type":"ContainerStarted","Data":"0cecf42074f1be7e6fff0a07e26fa906082212ea313b65715bd41d8cfeb996d8"} Apr 24 23:54:56.312474 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:56.312419 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2s8sw" event={"ID":"f8df9612-54a5-4673-b2cc-33d7768fe61c","Type":"ContainerStarted","Data":"b7c9d89c4fda115e5d35105f46015707361eff9970a02353710600e015fd0758"} Apr 24 23:54:56.312924 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:56.312483 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2s8sw" event={"ID":"f8df9612-54a5-4673-b2cc-33d7768fe61c","Type":"ContainerStarted","Data":"3dd8f62dc4ef8857e41c8b98af8cc373ce0ecebd2cbd2e8167a8fd1d7b6b915b"} Apr 24 23:54:56.314408 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:56.314366 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hb4xm" event={"ID":"67b28161-03e9-4905-8e32-8b7353db6c58","Type":"ContainerStarted","Data":"408be224212ec40329381fa2467a2b2c0702d0432732ad9d7929d6d7e73068e4"} Apr 24 23:54:56.328250 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:56.328121 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2s8sw" podStartSLOduration=70.701819664 podStartE2EDuration="1m15.328103804s" podCreationTimestamp="2026-04-24 23:53:41 +0000 UTC" 
firstStartedPulling="2026-04-24 23:54:50.536102781 +0000 UTC m=+70.178017641" lastFinishedPulling="2026-04-24 23:54:55.162386916 +0000 UTC m=+74.804301781" observedRunningTime="2026-04-24 23:54:56.327238787 +0000 UTC m=+75.969153662" watchObservedRunningTime="2026-04-24 23:54:56.328103804 +0000 UTC m=+75.970018677" Apr 24 23:54:57.319124 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.319084 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmmfk" event={"ID":"6af5e39d-3c39-4d8c-b886-74bd88370c79","Type":"ContainerStarted","Data":"f85ce41df8966fdab3e6e2c2e0c6d5873894725d1e1c0f9a6db3198a4e15d742"} Apr 24 23:54:57.320985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.320956 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" event={"ID":"944e2ba1-b748-4770-ac3c-b746589e18e5","Type":"ContainerStarted","Data":"b39a3a5071bb8311335745bb0e174aae25e06edb32dc2fa69b4865b30e642a84"} Apr 24 23:54:57.321128 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.320992 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" event={"ID":"944e2ba1-b748-4770-ac3c-b746589e18e5","Type":"ContainerStarted","Data":"d3fa27964bf9d1932040cbf81b9e9d3796f61360a4375767b89e87b6d6655f61"} Apr 24 23:54:57.336146 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.336098 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kmmfk" podStartSLOduration=1.048760474 podStartE2EDuration="7.336080494s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.832241689 +0000 UTC m=+70.474156541" lastFinishedPulling="2026-04-24 23:54:57.119561707 +0000 UTC m=+76.761476561" observedRunningTime="2026-04-24 23:54:57.335486029 +0000 UTC m=+76.977400904" 
watchObservedRunningTime="2026-04-24 23:54:57.336080494 +0000 UTC m=+76.977995367" Apr 24 23:54:57.337127 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.337088 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hb4xm" podStartSLOduration=70.628145585 podStartE2EDuration="1m15.337077913s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.474255191 +0000 UTC m=+70.116170072" lastFinishedPulling="2026-04-24 23:54:55.183187546 +0000 UTC m=+74.825102400" observedRunningTime="2026-04-24 23:54:56.340745793 +0000 UTC m=+75.982660701" watchObservedRunningTime="2026-04-24 23:54:57.337077913 +0000 UTC m=+76.978992789" Apr 24 23:54:57.354100 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:54:57.354046 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6cd7fcbd-gw9rm" podStartSLOduration=1.770010352 podStartE2EDuration="8.354032254s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.5397146 +0000 UTC m=+70.181629458" lastFinishedPulling="2026-04-24 23:54:57.123736497 +0000 UTC m=+76.765651360" observedRunningTime="2026-04-24 23:54:57.353611986 +0000 UTC m=+76.995526859" watchObservedRunningTime="2026-04-24 23:54:57.354032254 +0000 UTC m=+76.995947128" Apr 24 23:55:01.294907 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:01.294769 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cmhdt" Apr 24 23:55:03.446089 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.446054 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zwdbl"] Apr 24 23:55:03.450833 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.450807 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.453624 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.453596 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6rtvs\"" Apr 24 23:55:03.454163 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.454143 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 23:55:03.454265 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.454194 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 23:55:03.454411 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.454390 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 23:55:03.454559 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.454543 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 23:55:03.551545 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551510 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl54t\" (UniqueName: \"kubernetes.io/projected/26d781b2-1576-4d2f-acd6-2fe49496e995-kube-api-access-pl54t\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551545 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551550 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwdbl\" (UID: 
\"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551575 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-metrics-client-ca\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551590 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-sys\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551612 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-wtmp\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551645 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551666 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551729 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-textfile\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.551773 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.551751 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-root\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653157 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653120 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl54t\" (UniqueName: \"kubernetes.io/projected/26d781b2-1576-4d2f-acd6-2fe49496e995-kube-api-access-pl54t\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653157 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653157 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653187 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-metrics-client-ca\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653258 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-sys\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653302 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-wtmp\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653337 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653352 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-sys\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653399 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653361 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653420 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-textfile\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653482 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-root\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:55:03.653503 2559 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653535 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-wtmp\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:55:03.653564 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls 
podName:26d781b2-1576-4d2f-acd6-2fe49496e995 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.153542876 +0000 UTC m=+83.795457744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls") pod "node-exporter-zwdbl" (UID: "26d781b2-1576-4d2f-acd6-2fe49496e995") : secret "node-exporter-tls" not found Apr 24 23:55:03.653739 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653569 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/26d781b2-1576-4d2f-acd6-2fe49496e995-root\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653783 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-textfile\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653806 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-metrics-client-ca\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.653985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.653879 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwdbl\" (UID: 
\"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.655660 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.655637 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:03.662781 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:03.662761 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl54t\" (UniqueName: \"kubernetes.io/projected/26d781b2-1576-4d2f-acd6-2fe49496e995-kube-api-access-pl54t\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:04.156690 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:04.156645 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:04.159212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:04.159187 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26d781b2-1576-4d2f-acd6-2fe49496e995-node-exporter-tls\") pod \"node-exporter-zwdbl\" (UID: \"26d781b2-1576-4d2f-acd6-2fe49496e995\") " pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:04.360490 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:04.360441 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zwdbl" Apr 24 23:55:04.369696 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:55:04.369663 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d781b2_1576_4d2f_acd6_2fe49496e995.slice/crio-a6e24e1a40402db1392bb3e30271257df0ee88124fdae6c0cca5b137ed9af9fb WatchSource:0}: Error finding container a6e24e1a40402db1392bb3e30271257df0ee88124fdae6c0cca5b137ed9af9fb: Status 404 returned error can't find the container with id a6e24e1a40402db1392bb3e30271257df0ee88124fdae6c0cca5b137ed9af9fb Apr 24 23:55:05.346898 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:05.346802 2559 generic.go:358] "Generic (PLEG): container finished" podID="26d781b2-1576-4d2f-acd6-2fe49496e995" containerID="53dc7b903db36d72f29d397d4980bd4b266dab8afe38203340572bb726cfe8fd" exitCode=0 Apr 24 23:55:05.346898 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:05.346879 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwdbl" event={"ID":"26d781b2-1576-4d2f-acd6-2fe49496e995","Type":"ContainerDied","Data":"53dc7b903db36d72f29d397d4980bd4b266dab8afe38203340572bb726cfe8fd"} Apr 24 23:55:05.347313 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:05.346923 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwdbl" event={"ID":"26d781b2-1576-4d2f-acd6-2fe49496e995","Type":"ContainerStarted","Data":"a6e24e1a40402db1392bb3e30271257df0ee88124fdae6c0cca5b137ed9af9fb"} Apr 24 23:55:05.393050 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:05.393018 2559 patch_prober.go:28] interesting pod/image-registry-5797fc655d-8wv55 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 
23:55:05.393190 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:05.393078 2559 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:55:06.351379 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:06.351345 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwdbl" event={"ID":"26d781b2-1576-4d2f-acd6-2fe49496e995","Type":"ContainerStarted","Data":"7acb114ca18cba0b850357835ffa5972af3dec0742e3915cee0af63cbb894cdf"} Apr 24 23:55:06.351805 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:06.351388 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwdbl" event={"ID":"26d781b2-1576-4d2f-acd6-2fe49496e995","Type":"ContainerStarted","Data":"681f23b38c7375921d6e7a2b2c7f970f1b904a3fff267bcdd6a1db4245e09e80"} Apr 24 23:55:06.372171 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:06.372124 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zwdbl" podStartSLOduration=2.7123502840000002 podStartE2EDuration="3.372107899s" podCreationTimestamp="2026-04-24 23:55:03 +0000 UTC" firstStartedPulling="2026-04-24 23:55:04.371421241 +0000 UTC m=+84.013336092" lastFinishedPulling="2026-04-24 23:55:05.031178844 +0000 UTC m=+84.673093707" observedRunningTime="2026-04-24 23:55:06.37009185 +0000 UTC m=+86.012006727" watchObservedRunningTime="2026-04-24 23:55:06.372107899 +0000 UTC m=+86.014022824" Apr 24 23:55:07.254954 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:07.254927 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:55:13.331365 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:13.331332 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"] Apr 24 23:55:26.409906 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:26.409874 2559 generic.go:358] "Generic (PLEG): container finished" podID="f8c356cb-0fee-47ee-a119-26d729d14274" containerID="bd2250133a8451ea8a9d1a481f2f04c9e90ffbb03fdf3b1a37b0963c89f544f5" exitCode=0 Apr 24 23:55:26.410425 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:26.409936 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" event={"ID":"f8c356cb-0fee-47ee-a119-26d729d14274","Type":"ContainerDied","Data":"bd2250133a8451ea8a9d1a481f2f04c9e90ffbb03fdf3b1a37b0963c89f544f5"} Apr 24 23:55:26.410425 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:26.410318 2559 scope.go:117] "RemoveContainer" containerID="bd2250133a8451ea8a9d1a481f2f04c9e90ffbb03fdf3b1a37b0963c89f544f5" Apr 24 23:55:27.414255 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:27.414221 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b9kkx" event={"ID":"f8c356cb-0fee-47ee-a119-26d729d14274","Type":"ContainerStarted","Data":"17c14551e509053245c8890d1ef9399c3202ae5819dbb3caa070301f6f5e907a"} Apr 24 23:55:37.444654 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:37.444623 2559 generic.go:358] "Generic (PLEG): container finished" podID="f39c93ca-0e47-4091-b7b7-80b2901e8795" containerID="aa92fb55788f3bdd53b8d6712c7e9b8df712dffbb7fa9fd0d99ec8b0a0ee8406" exitCode=0 Apr 24 23:55:37.445135 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:37.444682 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" event={"ID":"f39c93ca-0e47-4091-b7b7-80b2901e8795","Type":"ContainerDied","Data":"aa92fb55788f3bdd53b8d6712c7e9b8df712dffbb7fa9fd0d99ec8b0a0ee8406"} Apr 24 23:55:37.445135 ip-10-0-129-4 
kubenswrapper[2559]: I0424 23:55:37.444987 2559 scope.go:117] "RemoveContainer" containerID="aa92fb55788f3bdd53b8d6712c7e9b8df712dffbb7fa9fd0d99ec8b0a0ee8406" Apr 24 23:55:38.353456 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.353382 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerName="registry" containerID="cri-o://571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478" gracePeriod=30 Apr 24 23:55:38.449046 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.449014 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4x6ss" event={"ID":"f39c93ca-0e47-4091-b7b7-80b2901e8795","Type":"ContainerStarted","Data":"63ee5a82ea4e507797b3b25735e4caf336f29ae916c4e3a88777ae56d8927043"} Apr 24 23:55:38.590828 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.590803 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" Apr 24 23:55:38.740038 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.739995 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740076 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740097 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740135 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740159 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: 
\"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740180 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp9h2\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740212 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740210 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740527 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740234 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates\") pod \"707e7bff-7937-422a-9cfa-268de45b3dd6\" (UID: \"707e7bff-7937-422a-9cfa-268de45b3dd6\") " Apr 24 23:55:38.740854 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740787 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:38.740985 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.740879 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:38.742841 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.742794 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:38.742952 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.742856 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:38.742952 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.742888 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:38.743041 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.743005 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2" (OuterVolumeSpecName: "kube-api-access-pp9h2") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "kube-api-access-pp9h2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:38.743093 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.743065 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:38.748973 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.748944 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "707e7bff-7937-422a-9cfa-268de45b3dd6" (UID: "707e7bff-7937-422a-9cfa-268de45b3dd6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:55:38.841338 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841304 2559 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/707e7bff-7937-422a-9cfa-268de45b3dd6-ca-trust-extracted\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841338 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841338 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp9h2\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-kube-api-access-pp9h2\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841350 2559 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-trusted-ca\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841360 2559 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-certificates\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841371 2559 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-image-registry-private-configuration\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841380 2559 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-bound-sa-token\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 24 23:55:38.841554 
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841389 2559 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/707e7bff-7937-422a-9cfa-268de45b3dd6-installation-pull-secrets\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 24 23:55:38.841554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:38.841397 2559 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/707e7bff-7937-422a-9cfa-268de45b3dd6-registry-tls\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 24 23:55:39.453077 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.453043 2559 generic.go:358] "Generic (PLEG): container finished" podID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerID="571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478" exitCode=0
Apr 24 23:55:39.453077 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.453084 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" event={"ID":"707e7bff-7937-422a-9cfa-268de45b3dd6","Type":"ContainerDied","Data":"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"}
Apr 24 23:55:39.453636 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.453106 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5797fc655d-8wv55" event={"ID":"707e7bff-7937-422a-9cfa-268de45b3dd6","Type":"ContainerDied","Data":"a0b348f0c7045535390d384493365afceeb0e8552206b9abfb4688650bd927f8"}
Apr 24 23:55:39.453636 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.453110 2559 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-5797fc655d-8wv55"
Apr 24 23:55:39.453636 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.453126 2559 scope.go:117] "RemoveContainer" containerID="571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"
Apr 24 23:55:39.460905 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.460885 2559 scope.go:117] "RemoveContainer" containerID="571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"
Apr 24 23:55:39.461183 ip-10-0-129-4 kubenswrapper[2559]: E0424 23:55:39.461160 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478\": container with ID starting with 571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478 not found: ID does not exist" containerID="571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"
Apr 24 23:55:39.461245 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.461196 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478"} err="failed to get container status \"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478\": rpc error: code = NotFound desc = could not find container \"571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478\": container with ID starting with 571ce31c75b79f91fd4d117fb2b792521d8e12be704b5d7f5e9476d8181e0478 not found: ID does not exist"
Apr 24 23:55:39.470310 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.470283 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"]
Apr 24 23:55:39.474556 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:39.474535 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5797fc655d-8wv55"]
Apr 24 23:55:40.936974
ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:40.936942 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" path="/var/lib/kubelet/pods/707e7bff-7937-422a-9cfa-268de45b3dd6/volumes"
Apr 24 23:55:57.511704 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:57.511673 2559 generic.go:358] "Generic (PLEG): container finished" podID="19bb6c72-6566-4c92-b004-41d6f12a658e" containerID="9f3a96694e5d16c3ab12da7ef2b23ac6f725ca7ea48540f8fcb480140d86b56a" exitCode=0
Apr 24 23:55:57.512103 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:57.511743 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" event={"ID":"19bb6c72-6566-4c92-b004-41d6f12a658e","Type":"ContainerDied","Data":"9f3a96694e5d16c3ab12da7ef2b23ac6f725ca7ea48540f8fcb480140d86b56a"}
Apr 24 23:55:57.512103 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:57.512075 2559 scope.go:117] "RemoveContainer" containerID="9f3a96694e5d16c3ab12da7ef2b23ac6f725ca7ea48540f8fcb480140d86b56a"
Apr 24 23:55:58.515763 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:55:58.515724 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hkhsn" event={"ID":"19bb6c72-6566-4c92-b004-41d6f12a658e","Type":"ContainerStarted","Data":"297d20d106c2df4dc31275b7ae8f08626e8815791cfdc3cc7e1e9ed137c537ff"}
Apr 24 23:58:40.858717 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:58:40.858685 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 24 23:58:40.859702 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:58:40.859681 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 24 23:58:40.865393 ip-10-0-129-4
kubenswrapper[2559]: I0424 23:58:40.865373 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:58:40.866037 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:58:40.866015 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 24 23:59:00.716808 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.716772 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"]
Apr 24 23:59:00.719024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.717051 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerName="registry"
Apr 24 23:59:00.719024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.717062 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerName="registry"
Apr 24 23:59:00.719024 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.717123 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="707e7bff-7937-422a-9cfa-268de45b3dd6" containerName="registry"
Apr 24 23:59:00.719837 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.719822 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.722308 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.722282 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-h4cf2\""
Apr 24 23:59:00.723210 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.723192 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:59:00.723320 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.723194 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 24 23:59:00.727199 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.727179 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"]
Apr 24 23:59:00.860524 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.860481 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.860716 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.860602 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klcr6\" (UniqueName: \"kubernetes.io/projected/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-kube-api-access-klcr6\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.961734 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.961692 2559 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.961931 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.961751 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klcr6\" (UniqueName: \"kubernetes.io/projected/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-kube-api-access-klcr6\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.962126 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.962104 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-tmp\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:00.970240 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:00.970166 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klcr6\" (UniqueName: \"kubernetes.io/projected/6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3-kube-api-access-klcr6\") pod \"openshift-lws-operator-bfc7f696d-hkq74\" (UID: \"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:01.029447 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:01.029421 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"
Apr 24 23:59:01.146792 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:01.146759 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74"]
Apr 24 23:59:01.149757 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:59:01.149729 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cc8cade_b8aa_42e0_802b_bb56b9a2f9b3.slice/crio-88cac3d811c764bedde8bd5cbb7139bd5915a35e3ecfa7af5f467bf3887b99e3 WatchSource:0}: Error finding container 88cac3d811c764bedde8bd5cbb7139bd5915a35e3ecfa7af5f467bf3887b99e3: Status 404 returned error can't find the container with id 88cac3d811c764bedde8bd5cbb7139bd5915a35e3ecfa7af5f467bf3887b99e3
Apr 24 23:59:01.151256 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:01.151234 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:59:02.022959 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:02.022920 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74" event={"ID":"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3","Type":"ContainerStarted","Data":"88cac3d811c764bedde8bd5cbb7139bd5915a35e3ecfa7af5f467bf3887b99e3"}
Apr 24 23:59:04.030209 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:04.030174 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74" event={"ID":"6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3","Type":"ContainerStarted","Data":"25a98c98631e3fb749f09e331c5626da694a8fa10d504c22977c53f93132d099"}
Apr 24 23:59:04.047636 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:04.047589 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hkq74" podStartSLOduration=1.693494971
podStartE2EDuration="4.04757604s" podCreationTimestamp="2026-04-24 23:59:00 +0000 UTC" firstStartedPulling="2026-04-24 23:59:01.151421296 +0000 UTC m=+320.793336151" lastFinishedPulling="2026-04-24 23:59:03.505502366 +0000 UTC m=+323.147417220" observedRunningTime="2026-04-24 23:59:04.046168889 +0000 UTC m=+323.688083764" watchObservedRunningTime="2026-04-24 23:59:04.04757604 +0000 UTC m=+323.689490913"
Apr 24 23:59:27.513186 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.513146 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"]
Apr 24 23:59:27.516624 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.516601 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.519207 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.519188 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 24 23:59:27.519377 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.519358 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-nd68z\""
Apr 24 23:59:27.519575 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.519560 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 24 23:59:27.528052 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.528032 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"]
Apr 24 23:59:27.563364 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.563330 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqfp\" (UniqueName: \"kubernetes.io/projected/6296d4d3-107c-49d9-8277-108c6d7678af-kube-api-access-5vqfp\") pod
\"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.563544 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.563400 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6296d4d3-107c-49d9-8277-108c6d7678af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.664689 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.664647 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6296d4d3-107c-49d9-8277-108c6d7678af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.664858 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.664703 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqfp\" (UniqueName: \"kubernetes.io/projected/6296d4d3-107c-49d9-8277-108c6d7678af-kube-api-access-5vqfp\") pod \"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.667196 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.667177 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6296d4d3-107c-49d9-8277-108c6d7678af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.673896 ip-10-0-129-4 kubenswrapper[2559]:
I0424 23:59:27.673875 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqfp\" (UniqueName: \"kubernetes.io/projected/6296d4d3-107c-49d9-8277-108c6d7678af-kube-api-access-5vqfp\") pod \"servicemesh-operator3-55f49c5f94-94ddh\" (UID: \"6296d4d3-107c-49d9-8277-108c6d7678af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.824952 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.824854 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:27.954392 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:27.954361 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"]
Apr 24 23:59:27.957960 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:59:27.957923 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6296d4d3_107c_49d9_8277_108c6d7678af.slice/crio-ab77a9dd35c7e1987ebe523a3d0ad7496dcf5e8b32ac622392518258e2b0fd60 WatchSource:0}: Error finding container ab77a9dd35c7e1987ebe523a3d0ad7496dcf5e8b32ac622392518258e2b0fd60: Status 404 returned error can't find the container with id ab77a9dd35c7e1987ebe523a3d0ad7496dcf5e8b32ac622392518258e2b0fd60
Apr 24 23:59:28.097058 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:28.096976 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh" event={"ID":"6296d4d3-107c-49d9-8277-108c6d7678af","Type":"ContainerStarted","Data":"ab77a9dd35c7e1987ebe523a3d0ad7496dcf5e8b32ac622392518258e2b0fd60"}
Apr 24 23:59:31.115027 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:31.114937 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
event={"ID":"6296d4d3-107c-49d9-8277-108c6d7678af","Type":"ContainerStarted","Data":"a86158e227f5a08b6bed4716687cc9fce64cd0ac4039323fa672a82de325ae2f"}
Apr 24 23:59:31.115378 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:31.115075 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh"
Apr 24 23:59:31.135018 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:31.134968 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh" podStartSLOduration=1.253581035 podStartE2EDuration="4.134952392s" podCreationTimestamp="2026-04-24 23:59:27 +0000 UTC" firstStartedPulling="2026-04-24 23:59:27.960383424 +0000 UTC m=+347.602298275" lastFinishedPulling="2026-04-24 23:59:30.841754767 +0000 UTC m=+350.483669632" observedRunningTime="2026-04-24 23:59:31.134532533 +0000 UTC m=+350.776447409" watchObservedRunningTime="2026-04-24 23:59:31.134952392 +0000 UTC m=+350.776867265"
Apr 24 23:59:38.265323 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.265285 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"]
Apr 24 23:59:38.272554 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.272529 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.275009 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.274947 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-7tdx4\""
Apr 24 23:59:38.275173 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.275025 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 24 23:59:38.282806 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.282776 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"]
Apr 24 23:59:38.356375 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356341 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f9tf\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-kube-api-access-6f9tf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356549 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356386 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356549 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356419 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName:
\"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356549 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356491 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/497c349c-f860-4bbe-9864-bb616471b5f7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356549 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356525 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356549 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356547 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356808 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356649 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName:
\"kubernetes.io/configmap/497c349c-f860-4bbe-9864-bb616471b5f7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356808 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356692 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.356808 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.356721 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457703 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457667 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457725 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName:
\"kubernetes.io/configmap/497c349c-f860-4bbe-9864-bb616471b5f7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457747 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457766 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457792 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9tf\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-kube-api-access-6f9tf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457824 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-socket\") pod
\"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.457886 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457867 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458251 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457909 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/497c349c-f860-4bbe-9864-bb616471b5f7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458251 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.457947 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458357 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.458311 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") "
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458357 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.458342 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458595 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.458574 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458673 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.458573 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.458784 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.458767 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/497c349c-f860-4bbe-9864-bb616471b5f7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.460290 ip-10-0-129-4 kubenswrapper[2559]: I0424
23:59:38.460269 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/497c349c-f860-4bbe-9864-bb616471b5f7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.460647 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.460628 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/497c349c-f860-4bbe-9864-bb616471b5f7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.465383 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.465363 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9tf\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-kube-api-access-6f9tf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.465502 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.465422 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/497c349c-f860-4bbe-9864-bb616471b5f7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rslp8\" (UID: \"497c349c-f860-4bbe-9864-bb616471b5f7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"
Apr 24 23:59:38.585737 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.585654 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" Apr 24 23:59:38.711900 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:38.711778 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8"] Apr 24 23:59:38.715408 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:59:38.715369 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497c349c_f860_4bbe_9864_bb616471b5f7.slice/crio-5beb9e3e5540b550b5eb5ad90390794f3a7e941a50e8a136c9bf8c76289f8bad WatchSource:0}: Error finding container 5beb9e3e5540b550b5eb5ad90390794f3a7e941a50e8a136c9bf8c76289f8bad: Status 404 returned error can't find the container with id 5beb9e3e5540b550b5eb5ad90390794f3a7e941a50e8a136c9bf8c76289f8bad Apr 24 23:59:39.138814 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:39.138779 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" event={"ID":"497c349c-f860-4bbe-9864-bb616471b5f7","Type":"ContainerStarted","Data":"5beb9e3e5540b550b5eb5ad90390794f3a7e941a50e8a136c9bf8c76289f8bad"} Apr 24 23:59:41.869443 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:41.869407 2559 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 24 23:59:41.869696 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:41.869501 2559 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 24 23:59:41.869696 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:41.869532 2559 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 24 23:59:42.027663 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.027590 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b"] Apr 24 23:59:42.031208 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.031191 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.034890 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.034871 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-fb2ft\"" Apr 24 23:59:42.035157 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.035137 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 24 23:59:42.035228 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.035199 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 24 23:59:42.035631 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.035615 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 24 23:59:42.042935 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.042909 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b"] Apr 24 23:59:42.086583 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.086551 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxwg\" (UniqueName: \"kubernetes.io/projected/16199b27-4899-4f66-8124-758e67b43ef3-kube-api-access-gwxwg\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: 
\"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.086766 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.086617 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-metrics-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.086766 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.086644 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/16199b27-4899-4f66-8124-758e67b43ef3-manager-config\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.086766 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.086663 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.120706 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.120678 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-94ddh" Apr 24 23:59:42.152764 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.152726 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" 
event={"ID":"497c349c-f860-4bbe-9864-bb616471b5f7","Type":"ContainerStarted","Data":"4b40afd4ca3256acbe739ef7a9b726c534119066573d42207548971f8531d947"} Apr 24 23:59:42.176816 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.176725 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" podStartSLOduration=1.025324541 podStartE2EDuration="4.176706804s" podCreationTimestamp="2026-04-24 23:59:38 +0000 UTC" firstStartedPulling="2026-04-24 23:59:38.717775338 +0000 UTC m=+358.359690192" lastFinishedPulling="2026-04-24 23:59:41.869157601 +0000 UTC m=+361.511072455" observedRunningTime="2026-04-24 23:59:42.175927204 +0000 UTC m=+361.817842078" watchObservedRunningTime="2026-04-24 23:59:42.176706804 +0000 UTC m=+361.818621679" Apr 24 23:59:42.187565 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.187540 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwxwg\" (UniqueName: \"kubernetes.io/projected/16199b27-4899-4f66-8124-758e67b43ef3-kube-api-access-gwxwg\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.187705 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.187631 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-metrics-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.187705 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.187664 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/16199b27-4899-4f66-8124-758e67b43ef3-manager-config\") pod 
\"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.187705 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.187696 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.188356 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.188333 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/16199b27-4899-4f66-8124-758e67b43ef3-manager-config\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.190306 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.190286 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.190396 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.190379 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/16199b27-4899-4f66-8124-758e67b43ef3-metrics-cert\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.197071 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.197049 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gwxwg\" (UniqueName: \"kubernetes.io/projected/16199b27-4899-4f66-8124-758e67b43ef3-kube-api-access-gwxwg\") pod \"lws-controller-manager-58fbc56fdc-wb76b\" (UID: \"16199b27-4899-4f66-8124-758e67b43ef3\") " pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.341403 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.341330 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:42.460761 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.460738 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b"] Apr 24 23:59:42.462755 ip-10-0-129-4 kubenswrapper[2559]: W0424 23:59:42.462727 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16199b27_4899_4f66_8124_758e67b43ef3.slice/crio-d33e0b8cdc9342d93b1648ce7201981893a7029a9d3bfc195ad70375a514273d WatchSource:0}: Error finding container d33e0b8cdc9342d93b1648ce7201981893a7029a9d3bfc195ad70375a514273d: Status 404 returned error can't find the container with id d33e0b8cdc9342d93b1648ce7201981893a7029a9d3bfc195ad70375a514273d Apr 24 23:59:42.586074 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.586046 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" Apr 24 23:59:42.590769 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:42.590747 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" Apr 24 23:59:43.156825 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:43.156788 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" 
event={"ID":"16199b27-4899-4f66-8124-758e67b43ef3","Type":"ContainerStarted","Data":"d33e0b8cdc9342d93b1648ce7201981893a7029a9d3bfc195ad70375a514273d"} Apr 24 23:59:43.157226 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:43.156967 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" Apr 24 23:59:43.157915 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:43.157879 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rslp8" Apr 24 23:59:45.164570 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:45.164531 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" event={"ID":"16199b27-4899-4f66-8124-758e67b43ef3","Type":"ContainerStarted","Data":"4399254b04bdd71f10f3285ca2e8c603642f309bedccad3beee0a1f3a9f3336b"} Apr 24 23:59:45.164966 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:45.164662 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 24 23:59:45.181599 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:45.181544 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" podStartSLOduration=2.43525014 podStartE2EDuration="4.181532371s" podCreationTimestamp="2026-04-24 23:59:41 +0000 UTC" firstStartedPulling="2026-04-24 23:59:42.464870126 +0000 UTC m=+362.106784977" lastFinishedPulling="2026-04-24 23:59:44.211152353 +0000 UTC m=+363.853067208" observedRunningTime="2026-04-24 23:59:45.180903255 +0000 UTC m=+364.822818130" watchObservedRunningTime="2026-04-24 23:59:45.181532371 +0000 UTC m=+364.823447243" Apr 24 23:59:56.169948 ip-10-0-129-4 kubenswrapper[2559]: I0424 23:59:56.169907 2559 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-58fbc56fdc-wb76b" Apr 25 00:00:18.014201 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.014168 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd"] Apr 25 00:00:18.027199 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.027170 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd"] Apr 25 00:00:18.027340 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.027259 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.029848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.029824 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 25 00:00:18.029996 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.029871 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 25 00:00:18.029996 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.029935 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 25 00:00:18.030888 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.030856 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6768d\"" Apr 25 00:00:18.030888 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.030881 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 25 00:00:18.079617 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.079590 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.079745 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.079624 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/818481f9-8390-4653-a55c-4046045ba15b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.079745 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.079702 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zk2g\" (UniqueName: \"kubernetes.io/projected/818481f9-8390-4653-a55c-4046045ba15b-kube-api-access-5zk2g\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.180720 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.180688 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.180884 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.180727 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/818481f9-8390-4653-a55c-4046045ba15b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.180884 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.180769 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zk2g\" (UniqueName: \"kubernetes.io/projected/818481f9-8390-4653-a55c-4046045ba15b-kube-api-access-5zk2g\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.180884 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:00:18.180857 2559 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 25 00:00:18.181038 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:00:18.180938 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert podName:818481f9-8390-4653-a55c-4046045ba15b nodeName:}" failed. No retries permitted until 2026-04-25 00:00:18.680915281 +0000 UTC m=+398.322830135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-p8xhd" (UID: "818481f9-8390-4653-a55c-4046045ba15b") : secret "plugin-serving-cert" not found Apr 25 00:00:18.181528 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.181505 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/818481f9-8390-4653-a55c-4046045ba15b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.201400 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.201373 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zk2g\" (UniqueName: \"kubernetes.io/projected/818481f9-8390-4653-a55c-4046045ba15b-kube-api-access-5zk2g\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.683846 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:18.683805 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:18.684025 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:00:18.683949 2559 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 25 00:00:18.684025 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:00:18.684016 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert podName:818481f9-8390-4653-a55c-4046045ba15b nodeName:}" failed. No retries permitted until 2026-04-25 00:00:19.683999862 +0000 UTC m=+399.325914713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-p8xhd" (UID: "818481f9-8390-4653-a55c-4046045ba15b") : secret "plugin-serving-cert" not found Apr 25 00:00:19.691624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:19.691593 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:19.694069 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:19.694046 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/818481f9-8390-4653-a55c-4046045ba15b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-p8xhd\" (UID: \"818481f9-8390-4653-a55c-4046045ba15b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:19.857359 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:19.857322 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" Apr 25 00:00:19.974313 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:19.974248 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd"] Apr 25 00:00:19.977196 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:00:19.977165 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod818481f9_8390_4653_a55c_4046045ba15b.slice/crio-aa8c35fb629faa2df55c61a992b79b103c72f327262f0956016feac96fd509db WatchSource:0}: Error finding container aa8c35fb629faa2df55c61a992b79b103c72f327262f0956016feac96fd509db: Status 404 returned error can't find the container with id aa8c35fb629faa2df55c61a992b79b103c72f327262f0956016feac96fd509db Apr 25 00:00:20.278771 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:20.278690 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" event={"ID":"818481f9-8390-4653-a55c-4046045ba15b","Type":"ContainerStarted","Data":"aa8c35fb629faa2df55c61a992b79b103c72f327262f0956016feac96fd509db"} Apr 25 00:00:25.298212 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:25.298185 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" event={"ID":"818481f9-8390-4653-a55c-4046045ba15b","Type":"ContainerStarted","Data":"8c0e2d3a9aec33b2a8a5c22394ff406c486b09e1f012a7e25623e84a908c6656"} Apr 25 00:00:26.317652 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:00:26.317601 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-p8xhd" podStartSLOduration=4.065437928 podStartE2EDuration="9.317586695s" podCreationTimestamp="2026-04-25 00:00:17 +0000 UTC" firstStartedPulling="2026-04-25 00:00:19.978535045 +0000 UTC m=+399.620449897" lastFinishedPulling="2026-04-25 
00:00:25.23068381 +0000 UTC m=+404.872598664" observedRunningTime="2026-04-25 00:00:26.316333665 +0000 UTC m=+405.958248538" watchObservedRunningTime="2026-04-25 00:00:26.317586695 +0000 UTC m=+405.959501582" Apr 25 00:01:02.350436 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.350343 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:02.354113 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.354089 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:02.356474 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.356436 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tt2fv\"" Apr 25 00:01:02.363308 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.363284 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:02.432524 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.432491 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdkx\" (UniqueName: \"kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx\") pod \"authorino-674b59b84c-q5ff7\" (UID: \"a399c6d4-3938-49d5-a77b-655d22062102\") " pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:02.533938 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.533898 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdkx\" (UniqueName: \"kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx\") pod \"authorino-674b59b84c-q5ff7\" (UID: \"a399c6d4-3938-49d5-a77b-655d22062102\") " pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:02.542036 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.542008 2559 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zwdkx\" (UniqueName: \"kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx\") pod \"authorino-674b59b84c-q5ff7\" (UID: \"a399c6d4-3938-49d5-a77b-655d22062102\") " pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:02.663992 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.663966 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:02.779479 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:02.779427 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:02.782010 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:01:02.781981 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda399c6d4_3938_49d5_a77b_655d22062102.slice/crio-818673d686083d19d638c0b21718f1d4586323b72bbc359d787f1df545672128 WatchSource:0}: Error finding container 818673d686083d19d638c0b21718f1d4586323b72bbc359d787f1df545672128: Status 404 returned error can't find the container with id 818673d686083d19d638c0b21718f1d4586323b72bbc359d787f1df545672128 Apr 25 00:01:03.422067 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:03.422033 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-q5ff7" event={"ID":"a399c6d4-3938-49d5-a77b-655d22062102","Type":"ContainerStarted","Data":"818673d686083d19d638c0b21718f1d4586323b72bbc359d787f1df545672128"} Apr 25 00:01:06.435153 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:06.435118 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-q5ff7" event={"ID":"a399c6d4-3938-49d5-a77b-655d22062102","Type":"ContainerStarted","Data":"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af"} Apr 25 00:01:06.449029 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:06.448983 2559 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-q5ff7" podStartSLOduration=1.305134051 podStartE2EDuration="4.448971451s" podCreationTimestamp="2026-04-25 00:01:02 +0000 UTC" firstStartedPulling="2026-04-25 00:01:02.783250261 +0000 UTC m=+442.425165113" lastFinishedPulling="2026-04-25 00:01:05.927087647 +0000 UTC m=+445.569002513" observedRunningTime="2026-04-25 00:01:06.447976178 +0000 UTC m=+446.089891051" watchObservedRunningTime="2026-04-25 00:01:06.448971451 +0000 UTC m=+446.090886324" Apr 25 00:01:08.571986 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.571936 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:08.572456 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.572129 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-q5ff7" podUID="a399c6d4-3938-49d5-a77b-655d22062102" containerName="authorino" containerID="cri-o://c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af" gracePeriod=30 Apr 25 00:01:08.805942 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.805914 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:08.892356 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.892275 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwdkx\" (UniqueName: \"kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx\") pod \"a399c6d4-3938-49d5-a77b-655d22062102\" (UID: \"a399c6d4-3938-49d5-a77b-655d22062102\") " Apr 25 00:01:08.894481 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.894427 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx" (OuterVolumeSpecName: "kube-api-access-zwdkx") pod "a399c6d4-3938-49d5-a77b-655d22062102" (UID: "a399c6d4-3938-49d5-a77b-655d22062102"). InnerVolumeSpecName "kube-api-access-zwdkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:01:08.993029 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:08.992996 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwdkx\" (UniqueName: \"kubernetes.io/projected/a399c6d4-3938-49d5-a77b-655d22062102-kube-api-access-zwdkx\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:01:09.445999 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.445967 2559 generic.go:358] "Generic (PLEG): container finished" podID="a399c6d4-3938-49d5-a77b-655d22062102" containerID="c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af" exitCode=0 Apr 25 00:01:09.446173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.446019 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-q5ff7" Apr 25 00:01:09.446173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.446039 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-q5ff7" event={"ID":"a399c6d4-3938-49d5-a77b-655d22062102","Type":"ContainerDied","Data":"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af"} Apr 25 00:01:09.446173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.446067 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-q5ff7" event={"ID":"a399c6d4-3938-49d5-a77b-655d22062102","Type":"ContainerDied","Data":"818673d686083d19d638c0b21718f1d4586323b72bbc359d787f1df545672128"} Apr 25 00:01:09.446173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.446080 2559 scope.go:117] "RemoveContainer" containerID="c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af" Apr 25 00:01:09.454179 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.454162 2559 scope.go:117] "RemoveContainer" containerID="c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af" Apr 25 00:01:09.454387 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:01:09.454368 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af\": container with ID starting with c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af not found: ID does not exist" containerID="c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af" Apr 25 00:01:09.454442 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.454394 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af"} err="failed to get container status \"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af\": rpc error: code = NotFound desc 
= could not find container \"c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af\": container with ID starting with c1b5bf7a8de7162527a5030521cac94219482c7557809c2621dabde86b0d55af not found: ID does not exist" Apr 25 00:01:09.461192 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.461170 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:09.463224 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:09.463197 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-q5ff7"] Apr 25 00:01:10.936138 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:10.936109 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a399c6d4-3938-49d5-a77b-655d22062102" path="/var/lib/kubelet/pods/a399c6d4-3938-49d5-a77b-655d22062102/volumes" Apr 25 00:01:27.489206 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.489172 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-c8wnm"] Apr 25 00:01:27.489599 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.489479 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a399c6d4-3938-49d5-a77b-655d22062102" containerName="authorino" Apr 25 00:01:27.489599 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.489492 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="a399c6d4-3938-49d5-a77b-655d22062102" containerName="authorino" Apr 25 00:01:27.489599 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.489547 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="a399c6d4-3938-49d5-a77b-655d22062102" containerName="authorino" Apr 25 00:01:27.493431 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.493414 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.496406 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.496387 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tt2fv\"" Apr 25 00:01:27.497344 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.497328 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 25 00:01:27.510417 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.510397 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-c8wnm"] Apr 25 00:01:27.546712 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.546689 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-tls-cert\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.546816 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.546755 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdz6k\" (UniqueName: \"kubernetes.io/projected/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-kube-api-access-jdz6k\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.647784 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.647752 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdz6k\" (UniqueName: \"kubernetes.io/projected/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-kube-api-access-jdz6k\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.647941 ip-10-0-129-4 
kubenswrapper[2559]: I0425 00:01:27.647802 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-tls-cert\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.650170 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.650146 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-tls-cert\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.655823 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.655805 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdz6k\" (UniqueName: \"kubernetes.io/projected/51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f-kube-api-access-jdz6k\") pod \"authorino-68bd676465-c8wnm\" (UID: \"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f\") " pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.802299 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.802217 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-c8wnm" Apr 25 00:01:27.923078 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:27.922891 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-c8wnm"] Apr 25 00:01:27.925616 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:01:27.925588 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ba2f33_ec5c_41d9_9d98_eb9b6a3d805f.slice/crio-b43fad9712036fe45ab16d01c21399655625c7a70ffea8db00bcaa7f5f95e932 WatchSource:0}: Error finding container b43fad9712036fe45ab16d01c21399655625c7a70ffea8db00bcaa7f5f95e932: Status 404 returned error can't find the container with id b43fad9712036fe45ab16d01c21399655625c7a70ffea8db00bcaa7f5f95e932 Apr 25 00:01:28.516155 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:28.516116 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-c8wnm" event={"ID":"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f","Type":"ContainerStarted","Data":"b43fad9712036fe45ab16d01c21399655625c7a70ffea8db00bcaa7f5f95e932"} Apr 25 00:01:30.523426 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:30.523390 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-c8wnm" event={"ID":"51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f","Type":"ContainerStarted","Data":"fa871e5f55f167dde203c3fa641e659dc946cbfe7bc299664d3b8001de15e1b4"} Apr 25 00:01:30.539774 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:01:30.539726 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-c8wnm" podStartSLOduration=1.142956322 podStartE2EDuration="3.539711999s" podCreationTimestamp="2026-04-25 00:01:27 +0000 UTC" firstStartedPulling="2026-04-25 00:01:27.926806587 +0000 UTC m=+467.568721438" lastFinishedPulling="2026-04-25 00:01:30.323562264 +0000 UTC m=+469.965477115" 
observedRunningTime="2026-04-25 00:01:30.537825741 +0000 UTC m=+470.179740597" watchObservedRunningTime="2026-04-25 00:01:30.539711999 +0000 UTC m=+470.181626870" Apr 25 00:03:15.776678 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.776644 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-kvvnd"] Apr 25 00:03:15.778988 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.778972 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kvvnd" Apr 25 00:03:15.781633 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.781610 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 25 00:03:15.781745 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.781649 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j8qqq\"" Apr 25 00:03:15.781745 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.781610 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:03:15.782612 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.782593 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:03:15.786290 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.786074 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kvvnd"] Apr 25 00:03:15.830307 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.830275 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-877gl\" (UniqueName: \"kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl\") pod \"s3-init-kvvnd\" (UID: \"96cb7b04-a754-42fc-8678-8630e199a460\") " pod="kserve/s3-init-kvvnd" Apr 25 00:03:15.931650 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.931615 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-877gl\" (UniqueName: \"kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl\") pod \"s3-init-kvvnd\" (UID: \"96cb7b04-a754-42fc-8678-8630e199a460\") " pod="kserve/s3-init-kvvnd" Apr 25 00:03:15.940248 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:15.940212 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-877gl\" (UniqueName: \"kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl\") pod \"s3-init-kvvnd\" (UID: \"96cb7b04-a754-42fc-8678-8630e199a460\") " pod="kserve/s3-init-kvvnd" Apr 25 00:03:16.088559 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:16.088488 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kvvnd" Apr 25 00:03:16.212981 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:16.212899 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kvvnd"] Apr 25 00:03:16.215635 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:03:16.215604 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96cb7b04_a754_42fc_8678_8630e199a460.slice/crio-c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c WatchSource:0}: Error finding container c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c: Status 404 returned error can't find the container with id c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c Apr 25 00:03:16.879519 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:16.879401 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvvnd" event={"ID":"96cb7b04-a754-42fc-8678-8630e199a460","Type":"ContainerStarted","Data":"c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c"} Apr 25 00:03:20.900811 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:20.900722 2559 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve/s3-init-kvvnd" event={"ID":"96cb7b04-a754-42fc-8678-8630e199a460","Type":"ContainerStarted","Data":"a429cf44f6cfca739f7e6292b8267adc548237e7b69b2107a285fb2421b00925"} Apr 25 00:03:20.916889 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:20.916841 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-kvvnd" podStartSLOduration=1.542484416 podStartE2EDuration="5.916824021s" podCreationTimestamp="2026-04-25 00:03:15 +0000 UTC" firstStartedPulling="2026-04-25 00:03:16.217357068 +0000 UTC m=+575.859271919" lastFinishedPulling="2026-04-25 00:03:20.591696669 +0000 UTC m=+580.233611524" observedRunningTime="2026-04-25 00:03:20.915540989 +0000 UTC m=+580.557455875" watchObservedRunningTime="2026-04-25 00:03:20.916824021 +0000 UTC m=+580.558739003" Apr 25 00:03:23.911925 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:23.911887 2559 generic.go:358] "Generic (PLEG): container finished" podID="96cb7b04-a754-42fc-8678-8630e199a460" containerID="a429cf44f6cfca739f7e6292b8267adc548237e7b69b2107a285fb2421b00925" exitCode=0 Apr 25 00:03:23.912313 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:23.911958 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvvnd" event={"ID":"96cb7b04-a754-42fc-8678-8630e199a460","Type":"ContainerDied","Data":"a429cf44f6cfca739f7e6292b8267adc548237e7b69b2107a285fb2421b00925"} Apr 25 00:03:25.046980 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.046956 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kvvnd" Apr 25 00:03:25.094227 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.094194 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-877gl\" (UniqueName: \"kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl\") pod \"96cb7b04-a754-42fc-8678-8630e199a460\" (UID: \"96cb7b04-a754-42fc-8678-8630e199a460\") " Apr 25 00:03:25.096214 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.096192 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl" (OuterVolumeSpecName: "kube-api-access-877gl") pod "96cb7b04-a754-42fc-8678-8630e199a460" (UID: "96cb7b04-a754-42fc-8678-8630e199a460"). InnerVolumeSpecName "kube-api-access-877gl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:03:25.195754 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.195668 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-877gl\" (UniqueName: \"kubernetes.io/projected/96cb7b04-a754-42fc-8678-8630e199a460-kube-api-access-877gl\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:03:25.919246 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.919217 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kvvnd" Apr 25 00:03:25.919415 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.919215 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kvvnd" event={"ID":"96cb7b04-a754-42fc-8678-8630e199a460","Type":"ContainerDied","Data":"c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c"} Apr 25 00:03:25.919415 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:25.919332 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c77375e8ef0b954f07c464856b9ff7fd995938da21b13c2448652a40ffd8ea3c" Apr 25 00:03:35.895111 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.895076 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"] Apr 25 00:03:35.896053 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.896025 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96cb7b04-a754-42fc-8678-8630e199a460" containerName="s3-init" Apr 25 00:03:35.896053 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.896048 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="96cb7b04-a754-42fc-8678-8630e199a460" containerName="s3-init" Apr 25 00:03:35.896246 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.896101 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="96cb7b04-a754-42fc-8678-8630e199a460" containerName="s3-init" Apr 25 00:03:35.898952 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.898931 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.901830 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.901379 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:03:35.901830 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.901493 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:03:35.901830 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.901607 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-rgwzr\"" Apr 25 00:03:35.902345 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.902321 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 25 00:03:35.909055 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.909030 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"] Apr 25 00:03:35.990700 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990658 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwcbg\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-kube-api-access-jwcbg\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990700 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990701 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-credential-socket\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990728 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990770 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990787 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6fa173d8-6bc5-4153-af7d-f309faae03b5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990824 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: 
\"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990856 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990896 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:35.990931 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:35.990912 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092173 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwcbg\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-kube-api-access-jwcbg\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092396 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092232 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092396 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092275 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092396 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092311 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092396 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092336 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6fa173d8-6bc5-4153-af7d-f309faae03b5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092396 ip-10-0-129-4 kubenswrapper[2559]: I0425 
00:03:36.092381 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092422 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092489 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092517 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092639 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092679 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.092750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092735 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.093026 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.092904 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.093269 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.093244 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6fa173d8-6bc5-4153-af7d-f309faae03b5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.094762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.094738 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.095107 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.095088 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.100095 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.100073 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.100625 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.100608 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwcbg\" (UniqueName: \"kubernetes.io/projected/6fa173d8-6bc5-4153-af7d-f309faae03b5-kube-api-access-jwcbg\") pod \"router-gateway-1-openshift-default-6c59fbf55c-kgh6v\" (UID: \"6fa173d8-6bc5-4153-af7d-f309faae03b5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" Apr 25 00:03:36.211749 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.211713 2559 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"
Apr 25 00:03:36.350451 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.350425 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"]
Apr 25 00:03:36.353109 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:03:36.353075 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa173d8_6bc5_4153_af7d_f309faae03b5.slice/crio-504f29479c1b21388861eaa30d5b044d4057f4c6f67340f6006758a160148b99 WatchSource:0}: Error finding container 504f29479c1b21388861eaa30d5b044d4057f4c6f67340f6006758a160148b99: Status 404 returned error can't find the container with id 504f29479c1b21388861eaa30d5b044d4057f4c6f67340f6006758a160148b99
Apr 25 00:03:36.355073 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.355040 2559 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 25 00:03:36.355186 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.355105 2559 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 25 00:03:36.355186 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.355134 2559 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 25 00:03:36.954974 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.954938 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" event={"ID":"6fa173d8-6bc5-4153-af7d-f309faae03b5","Type":"ContainerStarted","Data":"95bbaa04dcb959ef7ea69a077f382b6faff4498a64c9a707f88b99645e2eeb18"}
Apr 25 00:03:36.954974 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.954975 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" event={"ID":"6fa173d8-6bc5-4153-af7d-f309faae03b5","Type":"ContainerStarted","Data":"504f29479c1b21388861eaa30d5b044d4057f4c6f67340f6006758a160148b99"}
Apr 25 00:03:36.975425 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:36.975370 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v" podStartSLOduration=1.975354838 podStartE2EDuration="1.975354838s" podCreationTimestamp="2026-04-25 00:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:03:36.972969374 +0000 UTC m=+596.614884248" watchObservedRunningTime="2026-04-25 00:03:36.975354838 +0000 UTC m=+596.617269711"
Apr 25 00:03:37.212080 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:37.211989 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"
Apr 25 00:03:37.216650 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:37.216626 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"
Apr 25 00:03:37.958670 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:37.958642 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"
Apr 25 00:03:37.959773 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:37.959755 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-kgh6v"
Apr 25 00:03:40.883132 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:40.883102 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 25 00:03:40.883602 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:40.883562 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 25 00:03:40.889615 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:40.889593 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 25 00:03:40.889936 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:40.889915 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 25 00:03:46.263809 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.263770 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:03:46.267999 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.267975 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.270496 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.270470 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-jc76t\""
Apr 25 00:03:46.270496 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.270487 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:03:46.270691 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.270470 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 25 00:03:46.278002 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.277976 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:03:46.386493 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386428 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286st\" (UniqueName: \"kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.386681 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386556 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.386681 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386604 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.386681 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386626 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.386681 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386642 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.386848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.386731 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.487810 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487764 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487837 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286st\" (UniqueName: \"kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487890 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487926 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487954 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.487977 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488258 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.488231 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.488312 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488388 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.488328 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.488443 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.488413 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.490364 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.490346 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.495699 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.495680 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-286st\" (UniqueName: \"kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st\") pod \"scheduler-inline-config-test-kserve-router-scheduler-579d95mft7\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.579183 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.579081 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:03:46.710444 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.710163 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:03:46.713112 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:03:46.713087 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484d0e3d_a2d7_4616_a9a5_9fdfb7091ea3.slice/crio-53e13fa7b954059a937447eb068555654b9443756b06e087ed1232dab2325fdd WatchSource:0}: Error finding container 53e13fa7b954059a937447eb068555654b9443756b06e087ed1232dab2325fdd: Status 404 returned error can't find the container with id 53e13fa7b954059a937447eb068555654b9443756b06e087ed1232dab2325fdd
Apr 25 00:03:46.987368 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:46.987333 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerStarted","Data":"53e13fa7b954059a937447eb068555654b9443756b06e087ed1232dab2325fdd"}
Apr 25 00:03:51.014814 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:51.014773 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerStarted","Data":"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"}
Apr 25 00:03:52.018973 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:52.018942 2559 generic.go:358] "Generic (PLEG): container finished" podID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerID="664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc" exitCode=0
Apr 25 00:03:52.019356 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:52.019034 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerDied","Data":"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"}
Apr 25 00:03:54.035357 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:03:54.035321 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerStarted","Data":"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"}
Apr 25 00:04:25.147082 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:25.147047 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerStarted","Data":"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"}
Apr 25 00:04:25.147621 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:25.147335 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:04:25.149959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:25.149938 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:04:25.169040 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:25.168983 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" podStartSLOduration=1.469927438 podStartE2EDuration="39.168968088s" podCreationTimestamp="2026-04-25 00:03:46 +0000 UTC" firstStartedPulling="2026-04-25 00:03:46.714927711 +0000 UTC m=+606.356842561" lastFinishedPulling="2026-04-25 00:04:24.41396836 +0000 UTC m=+644.055883211" observedRunningTime="2026-04-25 00:04:25.16729174 +0000 UTC m=+644.809206614" watchObservedRunningTime="2026-04-25 00:04:25.168968088 +0000 UTC m=+644.810882960"
Apr 25 00:04:26.579383 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:26.579346 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:04:26.579383 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:26.579384 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:04:36.581258 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:36.581227 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:04:36.582565 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:04:36.582545 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:05:01.691619 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:01.691578 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:05:01.692057 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:01.691975 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="main" containerID="cri-o://d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15" gracePeriod=30
Apr 25 00:05:01.692128 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:01.692035 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="tokenizer" containerID="cri-o://8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764" gracePeriod=30
Apr 25 00:05:02.273715 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.273675 2559 generic.go:358] "Generic (PLEG): container finished" podID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerID="d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15" exitCode=0
Apr 25 00:05:02.273907 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.273747 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerDied","Data":"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"}
Apr 25 00:05:02.953186 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.953164 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:05:02.994790 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994753 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-286st\" (UniqueName: \"kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.994951 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994808 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.994951 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994850 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.994951 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994900 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.994951 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994925 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.995167 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.994963 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs\") pod \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\" (UID: \"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3\") "
Apr 25 00:05:02.995224 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995179 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:05:02.995224 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995200 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:05:02.995551 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995366 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:02.995551 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995393 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:02.995551 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995505 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:05:02.995775 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.995745 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:05:02.997291 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.997265 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:05:02.997391 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:02.997311 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st" (OuterVolumeSpecName: "kube-api-access-286st") pod "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" (UID: "484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3"). InnerVolumeSpecName "kube-api-access-286st". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:05:03.096569 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.096445 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:03.096569 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.096514 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:03.096569 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.096525 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-286st\" (UniqueName: \"kubernetes.io/projected/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-kube-api-access-286st\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:03.096569 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.096535 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:05:03.278712 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.278680 2559 generic.go:358] "Generic (PLEG): container finished" podID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerID="8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764" exitCode=0
Apr 25 00:05:03.278872 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.278720 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerDied","Data":"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"}
Apr 25 00:05:03.278872 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.278742 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7" event={"ID":"484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3","Type":"ContainerDied","Data":"53e13fa7b954059a937447eb068555654b9443756b06e087ed1232dab2325fdd"}
Apr 25 00:05:03.278872 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.278759 2559 scope.go:117] "RemoveContainer" containerID="8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"
Apr 25 00:05:03.278872 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.278760 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"
Apr 25 00:05:03.287251 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.287236 2559 scope.go:117] "RemoveContainer" containerID="d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"
Apr 25 00:05:03.294327 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.294308 2559 scope.go:117] "RemoveContainer" containerID="664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"
Apr 25 00:05:03.300580 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.300451 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:05:03.301578 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.301558 2559 scope.go:117] "RemoveContainer" containerID="8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"
Apr 25 00:05:03.301797 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:03.301779 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764\": container with ID starting with 8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764 not found: ID does not exist" containerID="8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"
Apr 25 00:05:03.301852 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.301805 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764"} err="failed to get container status \"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764\": rpc error: code = NotFound desc = could not find container \"8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764\": container with ID starting with 8e1ec7726d8c2f8bb3f1a89b138e237dbac25e6b5b3986164655d73a954f2764 not found: ID does not exist"
Apr 25 00:05:03.301852 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.301821 2559 scope.go:117] "RemoveContainer" containerID="d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"
Apr 25 00:05:03.302075 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:03.302050 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15\": container with ID starting with d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15 not found: ID does not exist" containerID="d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"
Apr 25 00:05:03.302167 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.302083 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15"} err="failed to get container status \"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15\": rpc error: code = NotFound desc = could not find container \"d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15\": container with ID starting with d474d2f10cc48db85c5b0dfed6c4c7c3d88d8dee99cadaff05a0cf26205c5d15 not found: ID does not exist"
Apr 25 00:05:03.302167 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.302106 2559 scope.go:117] "RemoveContainer" containerID="664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"
Apr 25 00:05:03.302697 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:03.302529 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc\": container with ID starting with 664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc not found: ID does not exist" containerID="664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"
Apr 25 00:05:03.302697 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.302578 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc"} err="failed to get container status \"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc\": rpc error: code = NotFound desc = could not find container \"664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc\": container with ID starting with 664ab9c970605375d88701a24e88a63b3bed333b32b1f14c3fda2ef169de51dc not found: ID does not exist"
Apr 25 00:05:03.304355 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:03.304331 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-579d95mft7"]
Apr 25 00:05:04.937172 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:04.937139 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" path="/var/lib/kubelet/pods/484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3/volumes"
Apr 25 00:05:13.881427 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881388 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"]
Apr 25 00:05:13.881935 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881911 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="storage-initializer"
Apr 25 00:05:13.882032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881937 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="storage-initializer"
Apr 25 00:05:13.882032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881960 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="main"
Apr 25 00:05:13.882032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881969 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="main"
Apr 25 00:05:13.882032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881987 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="tokenizer"
Apr 25 00:05:13.882032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.881997 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="tokenizer"
Apr 25 00:05:13.882190 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.882076 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="main"
Apr 25 00:05:13.882190 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.882094 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="484d0e3d-a2d7-4616-a9a5-9fdfb7091ea3" containerName="tokenizer"
Apr 25 00:05:13.887161 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.887142 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"
Apr 25 00:05:13.889359 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.889338 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 25 00:05:13.890181 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.890154 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-9k5tw\""
Apr 25 00:05:13.890273 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.890206 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:05:13.892855 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.892835 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"]
Apr 25 00:05:13.988786 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988747 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"
Apr 25 00:05:13.988786 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988783 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") "
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:13.989011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988803 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:13.989011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988832 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmwh\" (UniqueName: \"kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:13.989011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988885 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:13.989011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:13.988978 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: 
\"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.089727 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089690 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.089926 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089737 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.089926 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089753 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.089926 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089773 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.089926 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089886 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmwh\" (UniqueName: \"kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.090151 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.089997 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.090151 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.090096 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.090263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.090167 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.090263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.090213 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.090375 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.090351 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.092244 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.092219 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.096936 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.096918 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmwh\" (UniqueName: \"kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.197812 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.197774 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:14.324355 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.324322 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"] Apr 25 00:05:14.327447 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:05:14.327424 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5030e4c_e7d6_4a73_baf0_7a3cc38b12a1.slice/crio-8e496c62d62ffac1a4a58f3ff304d24c04c328351918c50769bb02ca15804256 WatchSource:0}: Error finding container 8e496c62d62ffac1a4a58f3ff304d24c04c328351918c50769bb02ca15804256: Status 404 returned error can't find the container with id 8e496c62d62ffac1a4a58f3ff304d24c04c328351918c50769bb02ca15804256 Apr 25 00:05:14.329129 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:14.329113 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:05:15.324728 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:15.324697 2559 generic.go:358] "Generic (PLEG): container finished" podID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerID="2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334" exitCode=0 Apr 25 00:05:15.325082 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:15.324756 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerDied","Data":"2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334"} Apr 25 00:05:15.325082 ip-10-0-129-4 kubenswrapper[2559]: 
I0425 00:05:15.324797 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerStarted","Data":"8e496c62d62ffac1a4a58f3ff304d24c04c328351918c50769bb02ca15804256"} Apr 25 00:05:16.329748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:16.329720 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerStarted","Data":"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec"} Apr 25 00:05:16.329748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:16.329750 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerStarted","Data":"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d"} Apr 25 00:05:16.330175 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:16.329870 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:16.351611 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:16.351555 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" podStartSLOduration=3.35153504 podStartE2EDuration="3.35153504s" podCreationTimestamp="2026-04-25 00:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:05:16.348993989 +0000 UTC m=+695.990908884" watchObservedRunningTime="2026-04-25 00:05:16.35153504 +0000 UTC m=+695.993449915" Apr 25 00:05:24.197956 ip-10-0-129-4 
kubenswrapper[2559]: I0425 00:05:24.197919 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:24.197956 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:24.197967 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:24.200668 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:24.200644 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:24.357239 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:24.357208 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:45.363751 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:45.363721 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:46.378009 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:46.377970 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"] Apr 25 00:05:46.378425 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:46.378386 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="main" containerID="cri-o://b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d" gracePeriod=30 Apr 25 00:05:46.378509 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:46.378425 2559 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="tokenizer" containerID="cri-o://c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec" gracePeriod=30 Apr 25 00:05:47.442193 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.442163 2559 generic.go:358] "Generic (PLEG): container finished" podID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerID="b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d" exitCode=0 Apr 25 00:05:47.442529 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.442203 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerDied","Data":"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d"} Apr 25 00:05:47.621473 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.621443 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:47.781686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781579 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781627 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781656 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781975 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781704 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781975 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781745 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781975 
ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781786 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vmwh\" (UniqueName: \"kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh\") pod \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\" (UID: \"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1\") " Apr 25 00:05:47.781975 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781892 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:47.782126 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.781987 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:47.782126 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.782077 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.782198 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.782135 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:47.782359 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.782340 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:47.783874 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.783853 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh" (OuterVolumeSpecName: "kube-api-access-9vmwh") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "kube-api-access-9vmwh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:05:47.783948 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.783855 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" (UID: "c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:05:47.883339 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.883296 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.883339 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.883333 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.883339 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.883345 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.883611 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.883356 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.883611 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:47.883366 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vmwh\" (UniqueName: \"kubernetes.io/projected/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1-kube-api-access-9vmwh\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:05:48.447569 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.447531 2559 generic.go:358] "Generic (PLEG): container finished" podID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerID="c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec" exitCode=0 Apr 25 00:05:48.448008 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.447621 2559 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" Apr 25 00:05:48.448008 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.447621 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerDied","Data":"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec"} Apr 25 00:05:48.448008 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.447664 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n" event={"ID":"c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1","Type":"ContainerDied","Data":"8e496c62d62ffac1a4a58f3ff304d24c04c328351918c50769bb02ca15804256"} Apr 25 00:05:48.448008 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.447679 2559 scope.go:117] "RemoveContainer" containerID="c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec" Apr 25 00:05:48.456251 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.456231 2559 scope.go:117] "RemoveContainer" containerID="b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d" Apr 25 00:05:48.463515 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.463494 2559 scope.go:117] "RemoveContainer" containerID="2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334" Apr 25 00:05:48.468526 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.468486 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"] Apr 25 00:05:48.471616 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.471598 2559 scope.go:117] "RemoveContainer" containerID="c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec" Apr 25 00:05:48.471907 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:48.471884 2559 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec\": container with ID starting with c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec not found: ID does not exist" containerID="c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec" Apr 25 00:05:48.471966 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.471920 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec"} err="failed to get container status \"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec\": rpc error: code = NotFound desc = could not find container \"c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec\": container with ID starting with c3c4eb5951ff63aaf47577811cca2e89767ce6aa400af52b874e7387cd10d8ec not found: ID does not exist" Apr 25 00:05:48.471966 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.471948 2559 scope.go:117] "RemoveContainer" containerID="b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d" Apr 25 00:05:48.472056 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.471996 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-575ccnhq5n"] Apr 25 00:05:48.472186 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:48.472168 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d\": container with ID starting with b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d not found: ID does not exist" containerID="b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d" Apr 25 00:05:48.472230 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.472194 2559 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d"} err="failed to get container status \"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d\": rpc error: code = NotFound desc = could not find container \"b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d\": container with ID starting with b71ecd3ed27dc49b3cbe000abbf2bcea356b65debc655961f30cccf1618fad9d not found: ID does not exist" Apr 25 00:05:48.472230 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.472211 2559 scope.go:117] "RemoveContainer" containerID="2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334" Apr 25 00:05:48.472430 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:05:48.472416 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334\": container with ID starting with 2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334 not found: ID does not exist" containerID="2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334" Apr 25 00:05:48.472488 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.472433 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334"} err="failed to get container status \"2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334\": rpc error: code = NotFound desc = could not find container \"2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334\": container with ID starting with 2e181635fca21f8cb088704d731ed8823a60f27eae9c0a8df544fe6c0f1da334 not found: ID does not exist" Apr 25 00:05:48.936601 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:05:48.936569 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" path="/var/lib/kubelet/pods/c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1/volumes" Apr 25 00:06:01.628819 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.628785 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"] Apr 25 00:06:01.629263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629218 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="storage-initializer" Apr 25 00:06:01.629263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629235 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="storage-initializer" Apr 25 00:06:01.629263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629247 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="main" Apr 25 00:06:01.629263 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629257 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="main" Apr 25 00:06:01.629497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629275 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="tokenizer" Apr 25 00:06:01.629497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629284 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="tokenizer" Apr 25 00:06:01.629497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629375 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="tokenizer" Apr 25 00:06:01.629497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.629390 2559 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="c5030e4c-e7d6-4a73-baf0-7a3cc38b12a1" containerName="main" Apr 25 00:06:01.634674 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.634654 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.637180 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.637155 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-db5td\"" Apr 25 00:06:01.637310 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.637255 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 25 00:06:01.637612 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.637596 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:06:01.641639 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.641620 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"] Apr 25 00:06:01.693675 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693641 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.693675 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693678 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.693965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693700 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd9w\" (UniqueName: \"kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.693965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693824 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.693965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693884 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.693965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.693937 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794381 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794344 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794394 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794425 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794488 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794511 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794536 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd9w\" (UniqueName: \"kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794893 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794823 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794945 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794903 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.794945 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794934 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.795040 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.794972 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.796919 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.796892 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.805035 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.805013 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd9w\" (UniqueName: 
\"kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:01.945372 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:01.945333 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:02.072446 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:02.072387 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"] Apr 25 00:06:02.075373 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:06:02.075346 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d42610e_0191_479f_b092_849da25e9d7d.slice/crio-2f1ba8a40c1373a0a6d449bacd4c7c11335fc8f548ae215071ce6c0a1c8ba0c2 WatchSource:0}: Error finding container 2f1ba8a40c1373a0a6d449bacd4c7c11335fc8f548ae215071ce6c0a1c8ba0c2: Status 404 returned error can't find the container with id 2f1ba8a40c1373a0a6d449bacd4c7c11335fc8f548ae215071ce6c0a1c8ba0c2 Apr 25 00:06:02.501097 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:02.501064 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerStarted","Data":"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"} Apr 25 00:06:02.501097 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:02.501100 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" 
event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerStarted","Data":"2f1ba8a40c1373a0a6d449bacd4c7c11335fc8f548ae215071ce6c0a1c8ba0c2"} Apr 25 00:06:03.506008 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:03.505975 2559 generic.go:358] "Generic (PLEG): container finished" podID="9d42610e-0191-479f-b092-849da25e9d7d" containerID="1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972" exitCode=0 Apr 25 00:06:03.506382 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:03.506029 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerDied","Data":"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"} Apr 25 00:06:04.512266 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:04.512229 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerStarted","Data":"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"} Apr 25 00:06:04.512266 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:04.512265 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerStarted","Data":"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"} Apr 25 00:06:04.512940 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:04.512374 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:04.533983 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:04.533936 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" podStartSLOduration=3.533920588 podStartE2EDuration="3.533920588s" podCreationTimestamp="2026-04-25 00:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:06:04.532455665 +0000 UTC m=+744.174370539" watchObservedRunningTime="2026-04-25 00:06:04.533920588 +0000 UTC m=+744.175835460" Apr 25 00:06:11.946004 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:11.945968 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:11.946004 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:11.946008 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:11.948820 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:11.948792 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:12.541428 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:12.541397 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:33.545345 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:33.545312 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" Apr 25 00:06:58.075081 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.075044 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"] Apr 25 00:06:58.087794 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.087533 
2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.090301 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.090273 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"] Apr 25 00:06:58.090628 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.090596 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 25 00:06:58.090768 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.090749 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-cjzmb\"" Apr 25 00:06:58.155004 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.154967 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.155174 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.155022 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6j5\" (UniqueName: \"kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.155174 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.155073 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.155174 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.155120 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.155174 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.155147 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.155313 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.155213 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256138 ip-10-0-129-4 kubenswrapper[2559]: 
I0425 00:06:58.256099 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256335 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256148 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256335 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256179 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6j5\" (UniqueName: \"kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256335 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256199 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256335 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256225 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256335 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256247 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256648 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256606 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256641 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256666 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.256773 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.256704 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.258805 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.258783 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.265412 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.265384 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6j5\" (UniqueName: \"kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.398059 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.397978 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:06:58.568031 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.567995 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"] Apr 25 00:06:58.571109 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:06:58.571080 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e932d6_cd4a_4d21_9d4f_8d35c18ef0e3.slice/crio-1d38486ed7b8db8f4bc907d0c7f1c9a743b1dc97c419bfbd0c7c2dc0e9a41edf WatchSource:0}: Error finding container 1d38486ed7b8db8f4bc907d0c7f1c9a743b1dc97c419bfbd0c7c2dc0e9a41edf: Status 404 returned error can't find the container with id 1d38486ed7b8db8f4bc907d0c7f1c9a743b1dc97c419bfbd0c7c2dc0e9a41edf Apr 25 00:06:58.695656 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.695613 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerStarted","Data":"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"} Apr 25 00:06:58.695656 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:58.695658 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerStarted","Data":"1d38486ed7b8db8f4bc907d0c7f1c9a743b1dc97c419bfbd0c7c2dc0e9a41edf"} Apr 25 00:06:59.701427 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:59.701393 2559 generic.go:358] "Generic (PLEG): container finished" podID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerID="633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0" exitCode=0 Apr 25 00:06:59.701810 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:06:59.701489 
2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerDied","Data":"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"} Apr 25 00:07:00.707245 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:00.707214 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerStarted","Data":"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"} Apr 25 00:07:00.707245 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:00.707248 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerStarted","Data":"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"} Apr 25 00:07:00.707684 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:00.707356 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" Apr 25 00:07:00.726763 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:00.726709 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" podStartSLOduration=2.726691599 podStartE2EDuration="2.726691599s" podCreationTimestamp="2026-04-25 00:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:07:00.725285008 +0000 UTC m=+800.367199885" watchObservedRunningTime="2026-04-25 00:07:00.726691599 +0000 UTC m=+800.368606476" Apr 25 00:07:08.398488 ip-10-0-129-4 kubenswrapper[2559]: I0425 
00:07:08.398427 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:07:08.398926 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:08.398510 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:07:08.401139 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:08.401116 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:07:08.734632 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:08.734599 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:07:29.738701 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:29.738670 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:07:52.852284 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:52.852251 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"]
Apr 25 00:07:52.852781 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:52.852598 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="main" containerID="cri-o://794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c" gracePeriod=30
Apr 25 00:07:52.852781 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:52.852660 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="tokenizer" containerID="cri-o://e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c" gracePeriod=30
Apr 25 00:07:53.544864 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:07:53.544836 2559 logging.go:55] [core] [Channel #134 SubChannel #135]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.32:9003", ServerName: "10.132.0.32:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.32:9003: connect: connection refused"
Apr 25 00:07:53.889349 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:53.889267 2559 generic.go:358] "Generic (PLEG): container finished" podID="9d42610e-0191-479f-b092-849da25e9d7d" containerID="794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c" exitCode=0
Apr 25 00:07:53.889349 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:53.889317 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerDied","Data":"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"}
Apr 25 00:07:54.092245 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.092224 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"
Apr 25 00:07:54.232488 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232409 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232509 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232545 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232583 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232602 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232686 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232628 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gd9w\" (UniqueName: \"kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w\") pod \"9d42610e-0191-479f-b092-849da25e9d7d\" (UID: \"9d42610e-0191-479f-b092-849da25e9d7d\") "
Apr 25 00:07:54.232959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232794 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.232959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232810 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.232959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232893 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.232959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232927 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.232959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.232939 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.233331 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.233303 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.234724 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.234697 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:07:54.234818 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.234769 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w" (OuterVolumeSpecName: "kube-api-access-7gd9w") pod "9d42610e-0191-479f-b092-849da25e9d7d" (UID: "9d42610e-0191-479f-b092-849da25e9d7d"). InnerVolumeSpecName "kube-api-access-7gd9w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:07:54.333782 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.333742 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d42610e-0191-479f-b092-849da25e9d7d-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.333782 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.333777 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.333782 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.333788 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d42610e-0191-479f-b092-849da25e9d7d-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.334011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.333798 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gd9w\" (UniqueName: \"kubernetes.io/projected/9d42610e-0191-479f-b092-849da25e9d7d-kube-api-access-7gd9w\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.545647 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.545543 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.32:9003\" within 1s: context deadline exceeded"
Apr 25 00:07:54.545647 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:07:54.545612 2559 logging.go:55] [core] [Channel #134 SubChannel #135]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.32:9003", ServerName: "10.132.0.32:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.32:9003: operation was canceled"
Apr 25 00:07:54.894002 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.893917 2559 generic.go:358] "Generic (PLEG): container finished" podID="9d42610e-0191-479f-b092-849da25e9d7d" containerID="e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c" exitCode=0
Apr 25 00:07:54.894002 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.893953 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerDied","Data":"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"}
Apr 25 00:07:54.894002 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.893995 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq" event={"ID":"9d42610e-0191-479f-b092-849da25e9d7d","Type":"ContainerDied","Data":"2f1ba8a40c1373a0a6d449bacd4c7c11335fc8f548ae215071ce6c0a1c8ba0c2"}
Apr 25 00:07:54.894002 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.893998 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"
Apr 25 00:07:54.894669 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.894010 2559 scope.go:117] "RemoveContainer" containerID="e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"
Apr 25 00:07:54.904636 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.903751 2559 scope.go:117] "RemoveContainer" containerID="794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"
Apr 25 00:07:54.912179 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.912158 2559 scope.go:117] "RemoveContainer" containerID="1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"
Apr 25 00:07:54.920078 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920058 2559 scope.go:117] "RemoveContainer" containerID="e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"
Apr 25 00:07:54.920316 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:07:54.920297 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c\": container with ID starting with e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c not found: ID does not exist" containerID="e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"
Apr 25 00:07:54.920361 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920324 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c"} err="failed to get container status \"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c\": rpc error: code = NotFound desc = could not find container \"e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c\": container with ID starting with e91b1cb1f36c7e06354510e8e82ebfb337e6465521141597e42430c4086d828c not found: ID does not exist"
Apr 25 00:07:54.920361 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920340 2559 scope.go:117] "RemoveContainer" containerID="794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"
Apr 25 00:07:54.920622 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:07:54.920605 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c\": container with ID starting with 794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c not found: ID does not exist" containerID="794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"
Apr 25 00:07:54.920695 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920629 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c"} err="failed to get container status \"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c\": rpc error: code = NotFound desc = could not find container \"794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c\": container with ID starting with 794efb6a565eb7942e96ae0a9a749a19f19879a5ce06cec19204e7043dacd37c not found: ID does not exist"
Apr 25 00:07:54.920695 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920651 2559 scope.go:117] "RemoveContainer" containerID="1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"
Apr 25 00:07:54.920926 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:07:54.920908 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972\": container with ID starting with 1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972 not found: ID does not exist" containerID="1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"
Apr 25 00:07:54.920981 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.920934 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972"} err="failed to get container status \"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972\": rpc error: code = NotFound desc = could not find container \"1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972\": container with ID starting with 1235b9676350a06a31f52275e10d6dfd5ad6e4dce5295b4a099e60dc47ed3972 not found: ID does not exist"
Apr 25 00:07:54.921041 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.921022 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"]
Apr 25 00:07:54.924640 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.924614 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c67hcrq"]
Apr 25 00:07:54.937013 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:07:54.936982 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d42610e-0191-479f-b092-849da25e9d7d" path="/var/lib/kubelet/pods/9d42610e-0191-479f-b092-849da25e9d7d/volumes"
Apr 25 00:08:40.905218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:08:40.905185 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 25 00:08:40.906206 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:08:40.906184 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 25 00:08:40.911104 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:08:40.911084 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 25 00:08:40.911927 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:08:40.911912 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 25 00:09:36.333729 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:36.333682 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"]
Apr 25 00:09:36.336106 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:36.334022 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="main" containerID="cri-o://910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b" gracePeriod=30
Apr 25 00:09:36.336106 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:36.334066 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="tokenizer" containerID="cri-o://b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd" gracePeriod=30
Apr 25 00:09:37.234744 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.234711 2559 generic.go:358] "Generic (PLEG): container finished" podID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerID="910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b" exitCode=0
Apr 25 00:09:37.234925 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.234758 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerDied","Data":"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"}
Apr 25 00:09:37.579523 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.579500 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:09:37.721650 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721614 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.721650 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721651 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.721903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721701 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.721903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721736 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.721903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721779 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6j5\" (UniqueName: \"kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.721903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.721804 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp\") pod \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\" (UID: \"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3\") "
Apr 25 00:09:37.722111 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.722013 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:09:37.722111 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.722048 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:09:37.722220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.722157 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:09:37.722450 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.722426 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:09:37.723858 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.723835 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:09:37.723942 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.723869 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5" (OuterVolumeSpecName: "kube-api-access-kd6j5") pod "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" (UID: "52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3"). InnerVolumeSpecName "kube-api-access-kd6j5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822684 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822711 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822721 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822730 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822738 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kd6j5\" (UniqueName: \"kubernetes.io/projected/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-kube-api-access-kd6j5\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:37.822762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:37.822746 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:09:38.240369 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.240338 2559 generic.go:358] "Generic (PLEG): container finished" podID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerID="b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd" exitCode=0
Apr 25 00:09:38.240579 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.240390 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerDied","Data":"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"}
Apr 25 00:09:38.240579 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.240412 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"
Apr 25 00:09:38.240579 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.240428 2559 scope.go:117] "RemoveContainer" containerID="b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"
Apr 25 00:09:38.240579 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.240418 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb" event={"ID":"52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3","Type":"ContainerDied","Data":"1d38486ed7b8db8f4bc907d0c7f1c9a743b1dc97c419bfbd0c7c2dc0e9a41edf"}
Apr 25 00:09:38.252623 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.252586 2559 scope.go:117] "RemoveContainer" containerID="910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"
Apr 25 00:09:38.260296 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.260274 2559 scope.go:117] "RemoveContainer" containerID="633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"
Apr 25 00:09:38.264508 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.264482 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"]
Apr 25 00:09:38.268904 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.268881 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schefq4nb"]
Apr 25 00:09:38.269387 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.269367 2559 scope.go:117] "RemoveContainer" containerID="b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"
Apr 25 00:09:38.269723 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:09:38.269702 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd\": container with ID starting with b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd not found: ID does not exist" containerID="b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"
Apr 25 00:09:38.269772 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.269736 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd"} err="failed to get container status \"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd\": rpc error: code = NotFound desc = could not find container \"b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd\": container with ID starting with b22bb27979c62bfbc872bdd043269d0a2422e12c9c54c7e565dd31dfa343a5bd not found: ID does not exist"
Apr 25 00:09:38.269772 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.269757 2559 scope.go:117] "RemoveContainer" containerID="910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"
Apr 25 00:09:38.269998 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:09:38.269982 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b\": container with ID starting with 910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b not found: ID does not exist" containerID="910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"
Apr 25 00:09:38.270040 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.270005 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b"} err="failed to get container status \"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b\": rpc error: code = NotFound desc = could not find container \"910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b\": container with ID starting with 910106f538188e1a53693ae864725cbc1a1f75e86589fcabf3ba884e25b5be8b not found: ID does not exist"
Apr 25 00:09:38.270040 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.270022 2559 scope.go:117] "RemoveContainer" containerID="633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"
Apr 25 00:09:38.270226 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:09:38.270211 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0\": container with ID starting with 633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0 not found: ID does not exist" containerID="633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"
Apr 25 00:09:38.270266 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.270229 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0"} err="failed to get container status \"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0\": rpc error: code = NotFound desc = could not find container \"633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0\": container with ID starting with 633f91ef92cfa9e11bee13163d5e301a50c0d7aec41d8032592584321ea526f0 not found: ID does not exist"
Apr 25 00:09:38.936338 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:38.936304 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" path="/var/lib/kubelet/pods/52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3/volumes"
Apr 25 00:09:41.135321 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135292 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"]
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135594 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135605 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135613 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135618 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135629 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="storage-initializer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135635 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="storage-initializer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135644 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="storage-initializer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135649 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="storage-initializer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135654 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="tokenizer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135659 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="tokenizer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135677 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="tokenizer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135683 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="tokenizer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135728 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="tokenizer"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135736 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="52e932d6-cd4a-4d21-9d4f-8d35c18ef0e3" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135743 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="main"
Apr 25 00:09:41.135750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.135749 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d42610e-0191-479f-b092-849da25e9d7d" containerName="tokenizer"
Apr 25 00:09:41.140316 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.140288 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"
Apr 25 00:09:41.143000 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.142980 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:09:41.143108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.142997 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-vptcn\""
Apr 25 00:09:41.143745 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.143721 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 25 00:09:41.145071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145048 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"
Apr 25 00:09:41.145176 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145092 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: 
\"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.145176 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145140 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.145284 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145211 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.145284 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145251 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.145379 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.145303 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq469\" (UniqueName: \"kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.150741 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.150720 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"] Apr 25 00:09:41.246189 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246091 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246189 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246139 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246189 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246169 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246189 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246190 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq469\" (UniqueName: \"kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246495 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246235 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246495 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246266 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246581 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246561 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246628 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246578 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246670 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246628 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.246710 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.246690 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.248715 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.248694 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.254587 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.254562 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq469\" (UniqueName: 
\"kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469\") pod \"custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.450521 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.450480 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:41.577913 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:41.577867 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"] Apr 25 00:09:41.580744 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:09:41.580709 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7929c1dc_5d4e_4e19_b3c1_858bbc4fc894.slice/crio-2918350e4a842559580755640dd2c9e5c4e00cb8ac03f28804ef70c05c33f219 WatchSource:0}: Error finding container 2918350e4a842559580755640dd2c9e5c4e00cb8ac03f28804ef70c05c33f219: Status 404 returned error can't find the container with id 2918350e4a842559580755640dd2c9e5c4e00cb8ac03f28804ef70c05c33f219 Apr 25 00:09:42.257779 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:42.257744 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerStarted","Data":"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055"} Apr 25 00:09:42.258162 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:42.257786 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" 
event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerStarted","Data":"2918350e4a842559580755640dd2c9e5c4e00cb8ac03f28804ef70c05c33f219"} Apr 25 00:09:43.262075 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:43.262042 2559 generic.go:358] "Generic (PLEG): container finished" podID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerID="e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055" exitCode=0 Apr 25 00:09:43.262075 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:43.262080 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerDied","Data":"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055"} Apr 25 00:09:44.267863 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:44.267824 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerStarted","Data":"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea"} Apr 25 00:09:44.267863 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:44.267866 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerStarted","Data":"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f"} Apr 25 00:09:44.268324 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:44.267980 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:51.450942 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:51.450902 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:51.451487 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:51.450956 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:51.453653 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:51.453622 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:09:51.474537 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:51.474485 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" podStartSLOduration=10.474445305 podStartE2EDuration="10.474445305s" podCreationTimestamp="2026-04-25 00:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:09:44.287887637 +0000 UTC m=+963.929802509" watchObservedRunningTime="2026-04-25 00:09:51.474445305 +0000 UTC m=+971.116360177" Apr 25 00:09:52.296171 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:09:52.296134 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:10:13.299947 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:10:13.299866 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:12:37.177009 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:37.176970 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"] Apr 25 00:12:37.177526 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:37.177381 
2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="main" containerID="cri-o://08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f" gracePeriod=30 Apr 25 00:12:37.177526 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:37.177427 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="tokenizer" containerID="cri-o://54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea" gracePeriod=30 Apr 25 00:12:37.868593 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:37.868558 2559 generic.go:358] "Generic (PLEG): container finished" podID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerID="08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f" exitCode=0 Apr 25 00:12:37.868770 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:37.868638 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerDied","Data":"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f"} Apr 25 00:12:38.414116 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.414095 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:12:38.440036 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440010 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440160 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440076 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440160 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440137 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440289 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440177 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq469\" (UniqueName: \"kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440289 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440226 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: 
\"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440289 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440244 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:12:38.440289 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440256 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs\") pod \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\" (UID: \"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894\") " Apr 25 00:12:38.440505 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440354 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:12:38.440556 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440503 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:12:38.440597 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440567 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.440597 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440579 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.440667 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440594 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.440830 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.440807 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:12:38.442384 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.442359 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469" (OuterVolumeSpecName: "kube-api-access-vq469") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "kube-api-access-vq469". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:12:38.442497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.442364 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" (UID: "7929c1dc-5d4e-4e19-b3c1-858bbc4fc894"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:12:38.541162 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.541053 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.541162 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.541102 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.541162 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.541121 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vq469\" (UniqueName: \"kubernetes.io/projected/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894-kube-api-access-vq469\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:12:38.873959 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.873879 2559 generic.go:358] "Generic (PLEG): container finished" podID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerID="54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea" exitCode=0 Apr 25 00:12:38.874103 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.873953 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" Apr 25 00:12:38.874103 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.873964 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerDied","Data":"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea"} Apr 25 00:12:38.874103 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.874008 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq" event={"ID":"7929c1dc-5d4e-4e19-b3c1-858bbc4fc894","Type":"ContainerDied","Data":"2918350e4a842559580755640dd2c9e5c4e00cb8ac03f28804ef70c05c33f219"} Apr 25 00:12:38.874103 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.874028 2559 scope.go:117] "RemoveContainer" containerID="54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea" Apr 25 00:12:38.882690 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.882674 2559 scope.go:117] "RemoveContainer" containerID="08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f" Apr 25 00:12:38.895055 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.895030 2559 scope.go:117] "RemoveContainer" containerID="e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055" Apr 25 00:12:38.896330 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.896307 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"] Apr 25 00:12:38.899301 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.899281 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7bdf6d6fpqhcq"] Apr 25 00:12:38.902892 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.902872 2559 scope.go:117] 
"RemoveContainer" containerID="54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea" Apr 25 00:12:38.903115 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:12:38.903096 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea\": container with ID starting with 54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea not found: ID does not exist" containerID="54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea" Apr 25 00:12:38.903167 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.903125 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea"} err="failed to get container status \"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea\": rpc error: code = NotFound desc = could not find container \"54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea\": container with ID starting with 54bb75fcd261277ff52f47e5b1b23a6be6721bebeebfd72b31c74a18473764ea not found: ID does not exist" Apr 25 00:12:38.903167 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.903144 2559 scope.go:117] "RemoveContainer" containerID="08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f" Apr 25 00:12:38.903405 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:12:38.903385 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f\": container with ID starting with 08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f not found: ID does not exist" containerID="08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f" Apr 25 00:12:38.903537 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.903409 2559 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f"} err="failed to get container status \"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f\": rpc error: code = NotFound desc = could not find container \"08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f\": container with ID starting with 08ee4e5276f37fa0807a6486bdfadfb816c0732be59b5f38be53a5d1776d027f not found: ID does not exist" Apr 25 00:12:38.903537 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.903428 2559 scope.go:117] "RemoveContainer" containerID="e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055" Apr 25 00:12:38.903692 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:12:38.903677 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055\": container with ID starting with e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055 not found: ID does not exist" containerID="e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055" Apr 25 00:12:38.903733 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.903696 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055"} err="failed to get container status \"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055\": rpc error: code = NotFound desc = could not find container \"e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055\": container with ID starting with e711bbd94eeb5ed0fd0c247e87d1501d7debea2b208ab8403ce5a9395332d055 not found: ID does not exist" Apr 25 00:12:38.936683 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:38.936645 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" 
path="/var/lib/kubelet/pods/7929c1dc-5d4e-4e19-b3c1-858bbc4fc894/volumes" Apr 25 00:12:56.139852 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.139814 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140237 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="main" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140254 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="main" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140278 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="storage-initializer" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140286 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="storage-initializer" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140313 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="tokenizer" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140323 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="tokenizer" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140401 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" containerName="main" Apr 25 00:12:56.140445 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.140413 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="7929c1dc-5d4e-4e19-b3c1-858bbc4fc894" 
containerName="tokenizer" Apr 25 00:12:56.145270 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.145249 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.147662 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.147642 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:12:56.147761 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.147702 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 25 00:12:56.147761 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.147702 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-bvttf\"" Apr 25 00:12:56.150871 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.150850 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:12:56.191728 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191682 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.191923 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191769 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location\") pod 
\"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.191923 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191802 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.191923 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191823 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59l5b\" (UniqueName: \"kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.191923 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191844 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.191923 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.191915 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.292903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.292864 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.292921 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.292960 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.292986 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59l5b\" (UniqueName: 
\"kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293017 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293071 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293063 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293365 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293342 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293432 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293366 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293513 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293443 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.293513 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.293439 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.295401 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.295380 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.302033 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.302014 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59l5b\" (UniqueName: \"kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b\") pod 
\"router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.455679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.455631 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:56.585477 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.585436 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:12:56.587867 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:12:56.587836 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55847d2c_0248_47eb_9023_5a8f58791143.slice/crio-1b6cb9fb84b246df2ff7cfd5fad41451e1b4f581088b760e9e07bcc0fc55985b WatchSource:0}: Error finding container 1b6cb9fb84b246df2ff7cfd5fad41451e1b4f581088b760e9e07bcc0fc55985b: Status 404 returned error can't find the container with id 1b6cb9fb84b246df2ff7cfd5fad41451e1b4f581088b760e9e07bcc0fc55985b Apr 25 00:12:56.589717 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.589699 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:12:56.942990 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.942951 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerStarted","Data":"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec"} Apr 25 00:12:56.943171 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:56.942998 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerStarted","Data":"1b6cb9fb84b246df2ff7cfd5fad41451e1b4f581088b760e9e07bcc0fc55985b"} Apr 25 00:12:57.947988 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:57.947953 2559 generic.go:358] "Generic (PLEG): container finished" podID="55847d2c-0248-47eb-9023-5a8f58791143" containerID="b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec" exitCode=0 Apr 25 00:12:57.948381 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:57.948042 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerDied","Data":"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec"} Apr 25 00:12:58.952965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:58.952927 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerStarted","Data":"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd"} Apr 25 00:12:58.952965 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:58.952965 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerStarted","Data":"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d"} Apr 25 00:12:58.953399 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:58.953212 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:12:58.972635 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:12:58.972585 2559 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" podStartSLOduration=2.972570192 podStartE2EDuration="2.972570192s" podCreationTimestamp="2026-04-25 00:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:12:58.97067769 +0000 UTC m=+1158.612592567" watchObservedRunningTime="2026-04-25 00:12:58.972570192 +0000 UTC m=+1158.614485044" Apr 25 00:13:06.456696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:06.456592 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:13:06.456696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:06.456639 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:13:06.459492 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:06.459449 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:13:06.980889 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:06.980863 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:13:28.987624 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:28.987592 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:13:40.935059 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:40.935030 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 
00:13:40.936994 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:40.936970 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:13:40.941513 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:40.941488 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:13:40.942658 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:13:40.942644 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:16:02.473361 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:02.473266 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:16:02.473960 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:02.473596 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="main" containerID="cri-o://d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d" gracePeriod=30 Apr 25 00:16:02.473960 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:02.473637 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="tokenizer" containerID="cri-o://a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd" gracePeriod=30 Apr 25 00:16:03.574415 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.574385 2559 generic.go:358] "Generic (PLEG): container finished" podID="55847d2c-0248-47eb-9023-5a8f58791143" 
containerID="d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d" exitCode=0 Apr 25 00:16:03.574807 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.574482 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerDied","Data":"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d"} Apr 25 00:16:03.721700 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.721677 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:16:03.842312 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842228 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842312 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842264 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59l5b\" (UniqueName: \"kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842312 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842287 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842312 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842313 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842423 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842453 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs\") pod \"55847d2c-0248-47eb-9023-5a8f58791143\" (UID: \"55847d2c-0248-47eb-9023-5a8f58791143\") " Apr 25 00:16:03.842696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842585 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:03.842696 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842596 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:03.842940 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842740 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:03.842940 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842773 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:03.842940 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.842796 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:03.843182 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.843158 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:03.844502 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.844446 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:16:03.844502 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.844496 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b" (OuterVolumeSpecName: "kube-api-access-59l5b") pod "55847d2c-0248-47eb-9023-5a8f58791143" (UID: "55847d2c-0248-47eb-9023-5a8f58791143"). InnerVolumeSpecName "kube-api-access-59l5b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:16:03.943578 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.943546 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:03.943578 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.943572 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55847d2c-0248-47eb-9023-5a8f58791143-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:03.943578 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.943581 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-59l5b\" (UniqueName: \"kubernetes.io/projected/55847d2c-0248-47eb-9023-5a8f58791143-kube-api-access-59l5b\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:03.943809 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:03.943591 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55847d2c-0248-47eb-9023-5a8f58791143-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:16:04.579741 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.579706 2559 generic.go:358] "Generic (PLEG): container finished" 
podID="55847d2c-0248-47eb-9023-5a8f58791143" containerID="a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd" exitCode=0 Apr 25 00:16:04.580218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.579806 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" Apr 25 00:16:04.580218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.579806 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerDied","Data":"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd"} Apr 25 00:16:04.580218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.579852 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5" event={"ID":"55847d2c-0248-47eb-9023-5a8f58791143","Type":"ContainerDied","Data":"1b6cb9fb84b246df2ff7cfd5fad41451e1b4f581088b760e9e07bcc0fc55985b"} Apr 25 00:16:04.580218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.579874 2559 scope.go:117] "RemoveContainer" containerID="a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd" Apr 25 00:16:04.589122 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.589105 2559 scope.go:117] "RemoveContainer" containerID="d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d" Apr 25 00:16:04.596206 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.596189 2559 scope.go:117] "RemoveContainer" containerID="b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec" Apr 25 00:16:04.605475 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.602226 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:16:04.606104 ip-10-0-129-4 kubenswrapper[2559]: 
I0425 00:16:04.606081 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-bff5c8885-952v5"] Apr 25 00:16:04.606747 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.606717 2559 scope.go:117] "RemoveContainer" containerID="a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd" Apr 25 00:16:04.607007 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:16:04.606990 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd\": container with ID starting with a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd not found: ID does not exist" containerID="a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd" Apr 25 00:16:04.607057 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.607019 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd"} err="failed to get container status \"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd\": rpc error: code = NotFound desc = could not find container \"a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd\": container with ID starting with a90e21445669baf72f19a5c5625fe154bfcf1cb85261bc934dfdbb2e3358c6cd not found: ID does not exist" Apr 25 00:16:04.607057 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.607039 2559 scope.go:117] "RemoveContainer" containerID="d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d" Apr 25 00:16:04.607303 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:16:04.607284 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d\": container with ID starting with 
d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d not found: ID does not exist" containerID="d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d" Apr 25 00:16:04.607357 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.607310 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d"} err="failed to get container status \"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d\": rpc error: code = NotFound desc = could not find container \"d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d\": container with ID starting with d27332c6c2ac64c2e2e9ce7ec1fc325220bd57688e90ca6c27dd84ae42be9d4d not found: ID does not exist" Apr 25 00:16:04.607357 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.607325 2559 scope.go:117] "RemoveContainer" containerID="b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec" Apr 25 00:16:04.607565 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:16:04.607550 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec\": container with ID starting with b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec not found: ID does not exist" containerID="b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec" Apr 25 00:16:04.607610 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.607569 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec"} err="failed to get container status \"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec\": rpc error: code = NotFound desc = could not find container \"b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec\": container with ID starting with 
b06ed95aa07230fac731eb4d28bea4cd5fb4d2a46e5e10d759c2ca1d000076ec not found: ID does not exist" Apr 25 00:16:04.936748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:04.936717 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55847d2c-0248-47eb-9023-5a8f58791143" path="/var/lib/kubelet/pods/55847d2c-0248-47eb-9023-5a8f58791143/volumes" Apr 25 00:16:24.546913 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.546877 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547184 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="tokenizer" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547196 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="tokenizer" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547214 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="main" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547219 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="main" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547226 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="storage-initializer" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547231 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="storage-initializer" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547280 2559 
memory_manager.go:356] "RemoveStaleState removing state" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="tokenizer" Apr 25 00:16:24.547325 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.547289 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="55847d2c-0248-47eb-9023-5a8f58791143" containerName="main" Apr 25 00:16:24.550110 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.550091 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.552676 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.552652 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-hfrd8\"" Apr 25 00:16:24.552785 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.552655 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 25 00:16:24.552785 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.552667 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:16:24.567032 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.566998 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:16:24.713840 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.713797 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.713840 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.713838 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.714050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.713897 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.714050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.713922 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.714050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.713988 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") 
" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.714152 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.714056 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m57h2\" (UniqueName: \"kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815015 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.814916 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815015 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.814982 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815242 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815030 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m57h2\" (UniqueName: \"kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815242 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815072 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815242 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815100 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815242 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815143 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815449 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815318 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815449 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815405 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815606 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815488 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.815606 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.815552 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.817576 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.817558 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.822816 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.822794 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m57h2\" (UniqueName: \"kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.859868 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.859830 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:24.992734 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:24.992707 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:16:24.995128 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:16:24.995099 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2373e0d0_5d63_46cd_b7f1_ee0d3c004e1f.slice/crio-896344f34354a0fe3d38c5e5a26d1b9d0eec247bb50a36dbda380bd9db9c1bfb WatchSource:0}: Error finding container 896344f34354a0fe3d38c5e5a26d1b9d0eec247bb50a36dbda380bd9db9c1bfb: Status 404 returned error can't find the container with id 896344f34354a0fe3d38c5e5a26d1b9d0eec247bb50a36dbda380bd9db9c1bfb Apr 25 00:16:25.647673 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:25.647640 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerStarted","Data":"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac"} Apr 25 
00:16:25.648054 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:25.647679 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerStarted","Data":"896344f34354a0fe3d38c5e5a26d1b9d0eec247bb50a36dbda380bd9db9c1bfb"} Apr 25 00:16:26.653151 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:26.653115 2559 generic.go:358] "Generic (PLEG): container finished" podID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerID="3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac" exitCode=0 Apr 25 00:16:26.653571 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:26.653198 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerDied","Data":"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac"} Apr 25 00:16:27.658590 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:27.658556 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerStarted","Data":"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330"} Apr 25 00:16:27.658590 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:27.658596 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerStarted","Data":"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561"} Apr 25 00:16:27.659011 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:27.658690 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:27.680505 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:27.680442 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" podStartSLOduration=3.680428592 podStartE2EDuration="3.680428592s" podCreationTimestamp="2026-04-25 00:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:16:27.678228281 +0000 UTC m=+1367.320143156" watchObservedRunningTime="2026-04-25 00:16:27.680428592 +0000 UTC m=+1367.322343464" Apr 25 00:16:34.860447 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:34.860403 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:34.860447 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:34.860452 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:34.862867 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:34.862841 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:35.687753 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:35.687722 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:16:56.691213 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:16:56.691184 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:18:40.956690 ip-10-0-129-4 kubenswrapper[2559]: 
I0425 00:18:40.956653 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:18:40.958369 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:18:40.958348 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:18:40.965156 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:18:40.965136 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:18:40.966977 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:18:40.966959 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:23:40.982721 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:23:40.982691 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:23:40.985297 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:23:40.985277 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:23:40.988891 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:23:40.988870 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:23:40.991057 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:23:40.991042 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 
25 00:28:41.008788 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:28:41.008757 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:28:41.010362 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:28:41.010341 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:28:41.014573 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:28:41.014554 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:28:41.016143 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:28:41.016127 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:30:56.508052 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.508015 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:30:56.511425 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.511407 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.513976 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.513952 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-wtvbs\"" Apr 25 00:30:56.514065 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.514008 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 25 00:30:56.526444 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.526419 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:30:56.598222 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598180 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.598222 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598221 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.598478 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598253 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.598478 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598336 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.598478 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598389 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nvz\" (UniqueName: \"kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.598478 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.598436 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699162 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699129 
2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nvz\" (UniqueName: \"kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699351 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699179 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699351 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699214 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699351 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699233 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699351 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699258 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699351 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699283 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699670 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699650 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699744 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699720 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699805 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699754 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.699805 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.699763 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.701828 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.701800 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.708863 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.708838 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nvz\" (UniqueName: \"kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:56.820728 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:56.820631 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:57.153631 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:57.153604 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:30:57.155697 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:30:57.155666 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5d2ee26_35e0_496b_bd80_4d3b59c7248e.slice/crio-c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293 WatchSource:0}: Error finding container c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293: Status 404 returned error can't find the container with id c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293 Apr 25 00:30:57.157511 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:57.157493 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:30:57.635996 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:57.635961 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerStarted","Data":"2240d55fa482a8231ff74a6a19691e627b2c19d66c60783069a9ef3d53896827"} Apr 25 00:30:57.635996 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:57.636000 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerStarted","Data":"c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293"} Apr 25 00:30:58.640966 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:58.640934 2559 generic.go:358] "Generic (PLEG): container finished" 
podID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerID="2240d55fa482a8231ff74a6a19691e627b2c19d66c60783069a9ef3d53896827" exitCode=0 Apr 25 00:30:58.641352 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:58.641012 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerDied","Data":"2240d55fa482a8231ff74a6a19691e627b2c19d66c60783069a9ef3d53896827"} Apr 25 00:30:59.647557 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:59.647472 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerStarted","Data":"7e01d6f73e2ab751381ca49054530c4bc76eeaf0f686ddac58cb2f580310a42d"} Apr 25 00:30:59.647557 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:59.647515 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerStarted","Data":"934ce1ece6223fdf737e5d4fd7bd225f4f360ac7a13a7d0d34265357a3237d29"} Apr 25 00:30:59.647977 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:59.647661 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:30:59.668731 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:30:59.668679 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" podStartSLOduration=3.668662266 podStartE2EDuration="3.668662266s" podCreationTimestamp="2026-04-25 00:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-25 00:30:59.666768515 +0000 UTC m=+2239.308683390" watchObservedRunningTime="2026-04-25 00:30:59.668662266 +0000 UTC m=+2239.310577169" Apr 25 00:31:06.821327 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.821295 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:31:06.821327 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.821334 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:31:06.824246 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.824218 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:31:06.983367 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.983334 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:31:06.983716 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.983682 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="main" containerID="cri-o://eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561" gracePeriod=30 Apr 25 00:31:06.983819 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:06.983742 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="tokenizer" containerID="cri-o://163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330" gracePeriod=30 Apr 25 00:31:07.676734 ip-10-0-129-4 
kubenswrapper[2559]: I0425 00:31:07.676698 2559 generic.go:358] "Generic (PLEG): container finished" podID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerID="eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561" exitCode=0 Apr 25 00:31:07.676920 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:07.676766 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerDied","Data":"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561"} Apr 25 00:31:07.678184 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:07.678166 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:31:08.237852 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.237826 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:31:08.405359 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405330 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405565 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405451 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405565 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405516 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405565 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405538 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405722 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405565 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405722 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405590 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m57h2\" (UniqueName: \"kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2\") pod \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\" (UID: \"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f\") " Apr 25 00:31:08.405722 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405698 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:31:08.405846 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405816 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:31:08.405846 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405826 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:31:08.405981 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405963 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.406047 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.405989 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.406047 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.406000 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.406287 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.406258 2559 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:31:08.407638 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.407617 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:31:08.407828 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.407800 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2" (OuterVolumeSpecName: "kube-api-access-m57h2") pod "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" (UID: "2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f"). InnerVolumeSpecName "kube-api-access-m57h2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:31:08.506441 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.506404 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.506441 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.506437 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.506441 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.506449 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m57h2\" (UniqueName: \"kubernetes.io/projected/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f-kube-api-access-m57h2\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:31:08.682765 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.682681 2559 generic.go:358] "Generic (PLEG): container finished" podID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerID="163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330" exitCode=0 Apr 25 00:31:08.682908 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.682769 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" Apr 25 00:31:08.682908 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.682771 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerDied","Data":"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330"} Apr 25 00:31:08.682908 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.682810 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7" event={"ID":"2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f","Type":"ContainerDied","Data":"896344f34354a0fe3d38c5e5a26d1b9d0eec247bb50a36dbda380bd9db9c1bfb"} Apr 25 00:31:08.682908 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.682825 2559 scope.go:117] "RemoveContainer" containerID="163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330" Apr 25 00:31:08.690983 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.690964 2559 scope.go:117] "RemoveContainer" containerID="eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561" Apr 25 00:31:08.698075 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.698059 2559 scope.go:117] "RemoveContainer" containerID="3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac" Apr 25 00:31:08.704568 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.704544 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:31:08.705503 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.705489 2559 scope.go:117] "RemoveContainer" containerID="163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330" Apr 25 00:31:08.705744 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:31:08.705727 2559 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330\": container with ID starting with 163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330 not found: ID does not exist" containerID="163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330" Apr 25 00:31:08.705787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.705754 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330"} err="failed to get container status \"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330\": rpc error: code = NotFound desc = could not find container \"163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330\": container with ID starting with 163cb99cb4b22a19e0f2782edf18449c9ff67c81d3decea10712ff466f883330 not found: ID does not exist" Apr 25 00:31:08.705787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.705771 2559 scope.go:117] "RemoveContainer" containerID="eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561" Apr 25 00:31:08.705976 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:31:08.705960 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561\": container with ID starting with eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561 not found: ID does not exist" containerID="eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561" Apr 25 00:31:08.706017 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.705981 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561"} err="failed to get container status \"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561\": 
rpc error: code = NotFound desc = could not find container \"eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561\": container with ID starting with eea5788f9b1e9f0f6d8e25e17691ba700ead1512821567551be36207ecf0a561 not found: ID does not exist" Apr 25 00:31:08.706017 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.705995 2559 scope.go:117] "RemoveContainer" containerID="3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac" Apr 25 00:31:08.706179 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:31:08.706163 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac\": container with ID starting with 3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac not found: ID does not exist" containerID="3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac" Apr 25 00:31:08.706218 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.706182 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac"} err="failed to get container status \"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac\": rpc error: code = NotFound desc = could not find container \"3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac\": container with ID starting with 3c2f31ee91590b4888a78b17c4006a043fe85239194915763facdbd705bd85ac not found: ID does not exist" Apr 25 00:31:08.710110 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.710085 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewgsc7"] Apr 25 00:31:08.936771 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:08.936692 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" 
path="/var/lib/kubelet/pods/2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f/volumes" Apr 25 00:31:24.929790 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.929753 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930104 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="tokenizer" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930117 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="tokenizer" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930135 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="storage-initializer" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930141 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="storage-initializer" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930149 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="main" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930157 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="main" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930208 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="tokenizer" Apr 25 00:31:24.930386 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.930216 2559 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="2373e0d0-5d63-46cd-b7f1-ee0d3c004e1f" containerName="main" Apr 25 00:31:24.933408 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.933383 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:24.935886 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.935862 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 25 00:31:24.935886 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.935882 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-8zk7q\"" Apr 25 00:31:24.945067 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:24.945044 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:31:25.036454 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036411 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvkhc\" (UniqueName: \"kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.036644 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036485 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.036644 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036540 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.036644 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036583 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.036644 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036613 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.036785 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.036657 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: 
\"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137264 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137230 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137264 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137266 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137493 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137291 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137493 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137312 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137493 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137344 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvkhc\" (UniqueName: \"kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137493 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137375 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137690 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137668 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137770 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137716 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137770 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137750 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.137868 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.137781 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.139736 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.139721 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.146132 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.146107 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvkhc\" (UniqueName: \"kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" 
Apr 25 00:31:25.245575 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.245479 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:25.373105 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.373047 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:31:25.375519 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:31:25.375490 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4466daf8_8260_4f8c_a22a_6d108d80d151.slice/crio-babad2b62574bc8edce4018f6c4fa4723f2bd7ca12ca19cf895ddf5197d78a80 WatchSource:0}: Error finding container babad2b62574bc8edce4018f6c4fa4723f2bd7ca12ca19cf895ddf5197d78a80: Status 404 returned error can't find the container with id babad2b62574bc8edce4018f6c4fa4723f2bd7ca12ca19cf895ddf5197d78a80 Apr 25 00:31:25.745173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.745138 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerStarted","Data":"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830"} Apr 25 00:31:25.745173 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:25.745179 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerStarted","Data":"babad2b62574bc8edce4018f6c4fa4723f2bd7ca12ca19cf895ddf5197d78a80"} Apr 25 00:31:26.750124 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:26.750088 2559 generic.go:358] "Generic (PLEG): container finished" podID="4466daf8-8260-4f8c-a22a-6d108d80d151" 
containerID="e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830" exitCode=0 Apr 25 00:31:26.750519 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:26.750177 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerDied","Data":"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830"} Apr 25 00:31:27.755618 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:27.755582 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerStarted","Data":"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1"} Apr 25 00:31:27.755618 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:27.755621 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerStarted","Data":"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6"} Apr 25 00:31:27.756057 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:27.755718 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:27.778583 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:27.778524 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" podStartSLOduration=3.778507038 podStartE2EDuration="3.778507038s" podCreationTimestamp="2026-04-25 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:31:27.777309626 +0000 UTC 
m=+2267.419224499" watchObservedRunningTime="2026-04-25 00:31:27.778507038 +0000 UTC m=+2267.420421912" Apr 25 00:31:28.684742 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:28.684715 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:31:35.246138 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:35.246096 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:35.246138 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:35.246145 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:35.248603 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:35.248573 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:35.791772 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:35.791741 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:31:56.792592 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:31:56.792565 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:33:41.034346 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:33:41.034316 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:33:41.036716 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:33:41.036689 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:33:41.041533 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:33:41.041511 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:33:41.043161 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:33:41.043145 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:35:05.178666 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:05.178631 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:35:05.179124 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:05.178969 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="main" containerID="cri-o://934ce1ece6223fdf737e5d4fd7bd225f4f360ac7a13a7d0d34265357a3237d29" gracePeriod=30 Apr 25 00:35:05.179124 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:05.179004 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="tokenizer" containerID="cri-o://7e01d6f73e2ab751381ca49054530c4bc76eeaf0f686ddac58cb2f580310a42d" gracePeriod=30 Apr 25 00:35:05.511131 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:05.511096 2559 generic.go:358] "Generic (PLEG): container finished" podID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerID="934ce1ece6223fdf737e5d4fd7bd225f4f360ac7a13a7d0d34265357a3237d29" exitCode=0 Apr 25 00:35:05.511301 
ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:05.511169 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerDied","Data":"934ce1ece6223fdf737e5d4fd7bd225f4f360ac7a13a7d0d34265357a3237d29"} Apr 25 00:35:06.523127 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.523094 2559 generic.go:358] "Generic (PLEG): container finished" podID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerID="7e01d6f73e2ab751381ca49054530c4bc76eeaf0f686ddac58cb2f580310a42d" exitCode=0 Apr 25 00:35:06.523428 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.523153 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerDied","Data":"7e01d6f73e2ab751381ca49054530c4bc76eeaf0f686ddac58cb2f580310a42d"} Apr 25 00:35:06.523428 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.523194 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" event={"ID":"e5d2ee26-35e0-496b-bd80-4d3b59c7248e","Type":"ContainerDied","Data":"c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293"} Apr 25 00:35:06.523428 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.523207 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c81bd41acebba961ac553525c7da364572b29b710067c85e9223e5dbb43c1293" Apr 25 00:35:06.527374 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.527355 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:35:06.616827 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616730 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 00:35:06.616827 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616773 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 00:35:06.616827 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616794 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 00:35:06.617116 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616848 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7nvz\" (UniqueName: \"kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 00:35:06.617116 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616905 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 
00:35:06.617116 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.616933 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache\") pod \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\" (UID: \"e5d2ee26-35e0-496b-bd80-4d3b59c7248e\") " Apr 25 00:35:06.617282 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.617158 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:06.617282 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.617259 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:06.617591 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.617570 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:06.617748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.617722 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:06.619302 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.619279 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:35:06.619399 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.619309 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz" (OuterVolumeSpecName: "kube-api-access-m7nvz") pod "e5d2ee26-35e0-496b-bd80-4d3b59c7248e" (UID: "e5d2ee26-35e0-496b-bd80-4d3b59c7248e"). InnerVolumeSpecName "kube-api-access-m7nvz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:35:06.718497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718440 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:06.718497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718495 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:06.718497 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718507 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:06.718787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718517 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7nvz\" (UniqueName: \"kubernetes.io/projected/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-kube-api-access-m7nvz\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:06.718787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718525 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:06.718787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:06.718534 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2ee26-35e0-496b-bd80-4d3b59c7248e-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:35:07.526763 ip-10-0-129-4 kubenswrapper[2559]: I0425 
00:35:07.526732 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s" Apr 25 00:35:07.544908 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:07.544878 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:35:07.548593 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:07.548560 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew6p5s"] Apr 25 00:35:08.939207 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:08.939172 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" path="/var/lib/kubelet/pods/e5d2ee26-35e0-496b-bd80-4d3b59c7248e/volumes" Apr 25 00:35:16.537861 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.537829 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538152 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="tokenizer" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538163 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="tokenizer" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538179 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="storage-initializer" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538184 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" 
containerName="storage-initializer" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538196 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="main" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538202 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="main" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538249 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="main" Apr 25 00:35:16.538333 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.538257 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d2ee26-35e0-496b-bd80-4d3b59c7248e" containerName="tokenizer" Apr 25 00:35:16.541098 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.541079 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.543652 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.543632 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-5bnwm\"" Apr 25 00:35:16.543755 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.543739 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 25 00:35:16.551653 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.551633 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:35:16.605748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605710 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.605748 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605749 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.605962 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605778 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.605962 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605796 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.605962 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605868 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5jc\" (UniqueName: \"kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.605962 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.605909 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707377 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707340 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707377 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707379 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707631 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707406 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707631 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707523 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707631 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707563 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vc5jc\" (UniqueName: \"kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707631 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707616 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707835 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707820 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707895 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707838 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707944 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707888 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.707982 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.707967 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.710088 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.710060 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.719247 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.719216 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5jc\" (UniqueName: \"kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.851622 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.851543 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:16.981337 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:16.981307 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:35:16.983154 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:35:16.983126 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde6ec62_36d8_4fc0_a97a_68885ee9d7eb.slice/crio-9e1fcd638f5b6310000c0309f731a2157658593b41dca234d0cd9e275d48574b WatchSource:0}: Error finding container 9e1fcd638f5b6310000c0309f731a2157658593b41dca234d0cd9e275d48574b: Status 404 returned error can't find the container with id 9e1fcd638f5b6310000c0309f731a2157658593b41dca234d0cd9e275d48574b Apr 25 00:35:17.565043 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:17.564995 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerStarted","Data":"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944"} Apr 25 00:35:17.565043 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:17.565044 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerStarted","Data":"9e1fcd638f5b6310000c0309f731a2157658593b41dca234d0cd9e275d48574b"} Apr 25 00:35:18.569951 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:18.569911 2559 generic.go:358] "Generic (PLEG): container finished" podID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerID="2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944" exitCode=0 Apr 25 00:35:18.570329 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:18.569990 
2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerDied","Data":"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944"} Apr 25 00:35:19.576190 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:19.576153 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerStarted","Data":"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462"} Apr 25 00:35:19.576190 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:19.576191 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerStarted","Data":"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9"} Apr 25 00:35:19.576816 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:19.576318 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:19.597798 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:19.597753 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" podStartSLOduration=3.597739043 podStartE2EDuration="3.597739043s" podCreationTimestamp="2026-04-25 00:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:35:19.596210642 +0000 UTC m=+2499.238125514" watchObservedRunningTime="2026-04-25 00:35:19.597739043 +0000 UTC m=+2499.239653915" Apr 25 00:35:26.851843 ip-10-0-129-4 kubenswrapper[2559]: I0425 
00:35:26.851806 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:26.851843 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:26.851849 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:26.854524 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:26.854496 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:27.607000 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:27.606976 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:35:48.610597 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:35:48.610564 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:36:43.506300 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:43.506265 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:36:43.506909 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:43.506742 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="main" containerID="cri-o://5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6" gracePeriod=30 Apr 25 00:36:43.506909 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:43.506801 2559 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="tokenizer" containerID="cri-o://4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1" gracePeriod=30 Apr 25 00:36:43.876778 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:43.876689 2559 generic.go:358] "Generic (PLEG): container finished" podID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerID="5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6" exitCode=0 Apr 25 00:36:43.876778 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:43.876763 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerDied","Data":"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6"} Apr 25 00:36:44.754999 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.754975 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:36:44.872255 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872165 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872255 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872206 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872255 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872237 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvkhc\" (UniqueName: \"kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872255 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872253 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872703 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872340 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: 
\"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872703 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872427 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs\") pod \"4466daf8-8260-4f8c-a22a-6d108d80d151\" (UID: \"4466daf8-8260-4f8c-a22a-6d108d80d151\") " Apr 25 00:36:44.872703 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872512 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:36:44.872703 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872535 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:36:44.872703 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872628 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:36:44.872870 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872774 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:36:44.872870 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872787 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:36:44.872870 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.872795 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:36:44.873189 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.873163 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:36:44.874500 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.874477 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc" (OuterVolumeSpecName: "kube-api-access-kvkhc") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "kube-api-access-kvkhc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:36:44.874557 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.874533 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4466daf8-8260-4f8c-a22a-6d108d80d151" (UID: "4466daf8-8260-4f8c-a22a-6d108d80d151"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:36:44.882580 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.882555 2559 generic.go:358] "Generic (PLEG): container finished" podID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerID="4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1" exitCode=0 Apr 25 00:36:44.882697 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.882640 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerDied","Data":"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1"} Apr 25 00:36:44.882697 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.882656 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" Apr 25 00:36:44.882697 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.882678 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5" event={"ID":"4466daf8-8260-4f8c-a22a-6d108d80d151","Type":"ContainerDied","Data":"babad2b62574bc8edce4018f6c4fa4723f2bd7ca12ca19cf895ddf5197d78a80"} Apr 25 00:36:44.882697 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.882695 2559 scope.go:117] "RemoveContainer" containerID="4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1" Apr 25 00:36:44.891958 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.891943 2559 scope.go:117] "RemoveContainer" containerID="5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6" Apr 25 00:36:44.898939 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.898919 2559 scope.go:117] "RemoveContainer" containerID="e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830" Apr 25 00:36:44.904630 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.904610 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:36:44.907188 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.907168 2559 scope.go:117] "RemoveContainer" containerID="4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1" Apr 25 00:36:44.907517 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:36:44.907451 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1\": container with ID starting with 4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1 not found: ID does not exist" containerID="4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1" Apr 25 
00:36:44.907585 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.907530 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1"} err="failed to get container status \"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1\": rpc error: code = NotFound desc = could not find container \"4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1\": container with ID starting with 4d365284a4213bd578610bf797b1a650141a8f6b6fc8fda9b0204e5a3a903fa1 not found: ID does not exist" Apr 25 00:36:44.907585 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.907554 2559 scope.go:117] "RemoveContainer" containerID="5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6" Apr 25 00:36:44.907821 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:36:44.907800 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6\": container with ID starting with 5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6 not found: ID does not exist" containerID="5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6" Apr 25 00:36:44.907869 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.907840 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6"} err="failed to get container status \"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6\": rpc error: code = NotFound desc = could not find container \"5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6\": container with ID starting with 5f2ec9db58221ecf37b65967e111dc124c79040867ded2983588d154c9ae62b6 not found: ID does not exist" Apr 25 00:36:44.907869 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.907863 2559 scope.go:117] 
"RemoveContainer" containerID="e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830" Apr 25 00:36:44.908062 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.908045 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7d6d4sqxw5"] Apr 25 00:36:44.908202 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:36:44.908091 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830\": container with ID starting with e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830 not found: ID does not exist" containerID="e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830" Apr 25 00:36:44.908202 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.908110 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830"} err="failed to get container status \"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830\": rpc error: code = NotFound desc = could not find container \"e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830\": container with ID starting with e24632ff690ed1de5933bfdf06846d2de2bf9db8b795d1fa490ed227c5418830 not found: ID does not exist" Apr 25 00:36:44.941439 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.941398 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" path="/var/lib/kubelet/pods/4466daf8-8260-4f8c-a22a-6d108d80d151/volumes" Apr 25 00:36:44.974084 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.974043 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvkhc\" (UniqueName: \"kubernetes.io/projected/4466daf8-8260-4f8c-a22a-6d108d80d151-kube-api-access-kvkhc\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath 
\"\"" Apr 25 00:36:44.974084 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.974080 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4466daf8-8260-4f8c-a22a-6d108d80d151-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:36:44.974288 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:36:44.974094 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4466daf8-8260-4f8c-a22a-6d108d80d151-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:37:41.083455 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:37:41.083423 2559 scope.go:117] "RemoveContainer" containerID="2240d55fa482a8231ff74a6a19691e627b2c19d66c60783069a9ef3d53896827" Apr 25 00:37:41.091423 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:37:41.091404 2559 scope.go:117] "RemoveContainer" containerID="7e01d6f73e2ab751381ca49054530c4bc76eeaf0f686ddac58cb2f580310a42d" Apr 25 00:37:41.098655 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:37:41.098625 2559 scope.go:117] "RemoveContainer" containerID="934ce1ece6223fdf737e5d4fd7bd225f4f360ac7a13a7d0d34265357a3237d29" Apr 25 00:38:41.057685 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:38:41.057656 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:38:41.060451 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:38:41.060429 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:38:41.064443 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:38:41.064427 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" 
Apr 25 00:38:41.067239 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:38:41.067222 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:43:41.081484 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:43:41.081434 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:43:41.084723 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:43:41.084696 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:43:41.087361 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:43:41.087341 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:43:41.090775 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:43:41.090759 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:48:41.105954 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:48:41.105834 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:48:41.109922 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:48:41.109902 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log" Apr 25 00:48:41.111645 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:48:41.111627 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:48:41.115824 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:48:41.115808 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log" Apr 25 00:50:09.181413 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:09.181382 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:50:09.181956 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:09.181712 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="main" containerID="cri-o://bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9" gracePeriod=30 Apr 25 00:50:09.181956 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:09.181760 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="tokenizer" containerID="cri-o://5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462" gracePeriod=30 Apr 25 00:50:09.609429 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:09.609340 2559 generic.go:358] "Generic (PLEG): container finished" podID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerID="bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9" exitCode=0 Apr 25 00:50:09.609429 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:09.609362 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" 
event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerDied","Data":"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9"} Apr 25 00:50:10.028311 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028275 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-smsr4/must-gather-vwfzk"] Apr 25 00:50:10.028618 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028604 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="storage-initializer" Apr 25 00:50:10.028679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028619 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="storage-initializer" Apr 25 00:50:10.028679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028630 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="main" Apr 25 00:50:10.028679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028636 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="main" Apr 25 00:50:10.028679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028647 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="tokenizer" Apr 25 00:50:10.028679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028653 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="tokenizer" Apr 25 00:50:10.028853 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028713 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="main" Apr 25 00:50:10.028853 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.028725 2559 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4466daf8-8260-4f8c-a22a-6d108d80d151" containerName="tokenizer" Apr 25 00:50:10.031560 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.031543 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.033953 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.033933 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-smsr4\"/\"openshift-service-ca.crt\"" Apr 25 00:50:10.034053 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.034008 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-smsr4\"/\"kube-root-ca.crt\"" Apr 25 00:50:10.053144 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.053117 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-smsr4/must-gather-vwfzk"] Apr 25 00:50:10.164762 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.164727 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ztgn\" (UniqueName: \"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.164878 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.164855 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.265775 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.265744 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.266187 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.265901 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ztgn\" (UniqueName: \"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.266187 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.266089 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.278083 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.278058 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ztgn\" (UniqueName: \"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn\") pod \"must-gather-vwfzk\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.322092 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.322068 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:50:10.342280 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.342245 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:10.464711 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.464673 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-smsr4/must-gather-vwfzk"] Apr 25 00:50:10.467903 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.467885 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.467978 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.467935 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc5jc\" (UniqueName: \"kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.467978 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.467954 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.468055 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.467977 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.468055 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468024 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.468160 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468078 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp\") pod \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\" (UID: \"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb\") " Apr 25 00:50:10.468476 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468411 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:50:10.468658 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468617 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-cache\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.468658 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468615 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:50:10.468852 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:50:10.468672 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac7f2609_8be2_4377_bac0_d3a34636c139.slice/crio-8257bdf73bf6c0b4d52388367c5421c1aa978734bef59d38464cd2a86cbd4171 WatchSource:0}: Error finding container 8257bdf73bf6c0b4d52388367c5421c1aa978734bef59d38464cd2a86cbd4171: Status 404 returned error can't find the container with id 8257bdf73bf6c0b4d52388367c5421c1aa978734bef59d38464cd2a86cbd4171 Apr 25 00:50:10.468852 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468721 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:50:10.468977 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.468911 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:50:10.470147 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.470097 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:50:10.470281 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.470198 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc" (OuterVolumeSpecName: "kube-api-access-vc5jc") pod "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" (UID: "dde6ec62-36d8-4fc0-a97a-68885ee9d7eb"). InnerVolumeSpecName "kube-api-access-vc5jc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:50:10.470346 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.470330 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:50:10.569864 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.569789 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vc5jc\" (UniqueName: \"kubernetes.io/projected/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kube-api-access-vc5jc\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.569864 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.569814 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-uds\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.569864 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.569824 2559 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-kserve-provision-location\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.569864 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.569834 2559 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tokenizer-tmp\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.569864 
ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.569843 2559 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb-tls-certs\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\"" Apr 25 00:50:10.614917 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.614891 2559 generic.go:358] "Generic (PLEG): container finished" podID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerID="5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462" exitCode=0 Apr 25 00:50:10.615050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.614960 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" Apr 25 00:50:10.615050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.614994 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerDied","Data":"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462"} Apr 25 00:50:10.615050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.615026 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz" event={"ID":"dde6ec62-36d8-4fc0-a97a-68885ee9d7eb","Type":"ContainerDied","Data":"9e1fcd638f5b6310000c0309f731a2157658593b41dca234d0cd9e275d48574b"} Apr 25 00:50:10.615050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.615047 2559 scope.go:117] "RemoveContainer" containerID="5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462" Apr 25 00:50:10.619099 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.619063 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smsr4/must-gather-vwfzk" 
event={"ID":"ac7f2609-8be2-4377-bac0-d3a34636c139","Type":"ContainerStarted","Data":"8257bdf73bf6c0b4d52388367c5421c1aa978734bef59d38464cd2a86cbd4171"} Apr 25 00:50:10.624647 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.624612 2559 scope.go:117] "RemoveContainer" containerID="bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9" Apr 25 00:50:10.631940 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.631924 2559 scope.go:117] "RemoveContainer" containerID="2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944" Apr 25 00:50:10.639375 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.639355 2559 scope.go:117] "RemoveContainer" containerID="5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462" Apr 25 00:50:10.639707 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:50:10.639684 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462\": container with ID starting with 5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462 not found: ID does not exist" containerID="5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462" Apr 25 00:50:10.639767 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.639730 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462"} err="failed to get container status \"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462\": rpc error: code = NotFound desc = could not find container \"5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462\": container with ID starting with 5163d28e44130c5dd2c1814726de6d120f39b639da2e952d186a0d883d8fb462 not found: ID does not exist" Apr 25 00:50:10.639767 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.639756 2559 scope.go:117] "RemoveContainer" 
containerID="bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9" Apr 25 00:50:10.640014 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:50:10.639992 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9\": container with ID starting with bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9 not found: ID does not exist" containerID="bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9" Apr 25 00:50:10.640108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.640018 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9"} err="failed to get container status \"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9\": rpc error: code = NotFound desc = could not find container \"bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9\": container with ID starting with bf71775ce7d63a7cc836440b731ca9b6ef8e00d06417071b5376f7516e41adc9 not found: ID does not exist" Apr 25 00:50:10.640108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.640034 2559 scope.go:117] "RemoveContainer" containerID="2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944" Apr 25 00:50:10.640108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.640034 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:50:10.640308 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:50:10.640288 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944\": container with ID starting with 2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944 not found: ID does not exist" 
containerID="2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944" Apr 25 00:50:10.640356 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.640313 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944"} err="failed to get container status \"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944\": rpc error: code = NotFound desc = could not find container \"2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944\": container with ID starting with 2e346b14de26a71ddbaf08f84ebee385b9b9beca1ccc28f2c89d9a2d5a319944 not found: ID does not exist" Apr 25 00:50:10.643208 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.643191 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7d595b8dbl55pz"] Apr 25 00:50:10.937765 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:10.937707 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" path="/var/lib/kubelet/pods/dde6ec62-36d8-4fc0-a97a-68885ee9d7eb/volumes" Apr 25 00:50:16.646548 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:16.646498 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smsr4/must-gather-vwfzk" event={"ID":"ac7f2609-8be2-4377-bac0-d3a34636c139","Type":"ContainerStarted","Data":"42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358"} Apr 25 00:50:16.646548 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:16.646555 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smsr4/must-gather-vwfzk" event={"ID":"ac7f2609-8be2-4377-bac0-d3a34636c139","Type":"ContainerStarted","Data":"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"} Apr 25 00:50:16.662914 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:16.662866 2559 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-must-gather-smsr4/must-gather-vwfzk" podStartSLOduration=1.578693076 podStartE2EDuration="6.662852956s" podCreationTimestamp="2026-04-25 00:50:10 +0000 UTC" firstStartedPulling="2026-04-25 00:50:10.470493205 +0000 UTC m=+3390.112408061" lastFinishedPulling="2026-04-25 00:50:15.55465309 +0000 UTC m=+3395.196567941" observedRunningTime="2026-04-25 00:50:16.661355539 +0000 UTC m=+3396.303270428" watchObservedRunningTime="2026-04-25 00:50:16.662852956 +0000 UTC m=+3396.304767829" Apr 25 00:50:24.710062 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:24.710025 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:25.881116 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:25.881083 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:26.986329 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:26.986274 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:28.064007 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:28.063972 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:29.162287 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:29.162241 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:30.240602 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:30.240574 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:31.324548 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:31.324513 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:32.414504 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:32.414474 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:33.521353 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:33.521314 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:34.621292 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:34.621258 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:35.739094 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:35.739066 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:36.898787 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:36.898761 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:38.001241 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:38.001194 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:39.090727 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:39.090686 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-kgh6v_6fa173d8-6bc5-4153-af7d-f309faae03b5/istio-proxy/0.log" Apr 25 00:50:40.378841 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:40.378790 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rslp8_497c349c-f860-4bbe-9864-bb616471b5f7/istio-proxy/0.log" Apr 25 00:50:40.396204 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:40.396176 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84bfffbb-vgvrl_a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c/router/0.log" Apr 25 00:50:41.272593 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:41.272568 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rslp8_497c349c-f860-4bbe-9864-bb616471b5f7/istio-proxy/0.log" Apr 25 00:50:41.289405 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:41.289377 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84bfffbb-vgvrl_a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c/router/0.log" Apr 25 00:50:42.123827 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:42.123789 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rslp8_497c349c-f860-4bbe-9864-bb616471b5f7/istio-proxy/0.log" Apr 25 00:50:42.139734 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:42.139690 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84bfffbb-vgvrl_a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c/router/0.log" Apr 25 00:50:42.860684 
ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:42.860622 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log" Apr 25 00:50:42.895738 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:42.895716 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log" Apr 25 00:50:43.785355 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:43.785322 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log" Apr 25 00:50:43.828717 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:43.828694 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log" Apr 25 00:50:44.705798 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:44.705765 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log" Apr 25 00:50:44.743933 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:44.743909 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log" Apr 25 00:50:45.600048 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:45.600021 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log" Apr 25 00:50:45.638255 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:45.638228 2559 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log" Apr 25 00:50:46.498147 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:46.498098 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log" Apr 25 00:50:46.540841 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:46.540813 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log" Apr 25 00:50:47.763033 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:47.763004 2559 generic.go:358] "Generic (PLEG): container finished" podID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerID="0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78" exitCode=0 Apr 25 00:50:47.763517 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:47.763072 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smsr4/must-gather-vwfzk" event={"ID":"ac7f2609-8be2-4377-bac0-d3a34636c139","Type":"ContainerDied","Data":"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"} Apr 25 00:50:47.763517 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:47.763397 2559 scope.go:117] "RemoveContainer" containerID="0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78" Apr 25 00:50:48.242812 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.242775 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smsr4_must-gather-vwfzk_ac7f2609-8be2-4377-bac0-d3a34636c139/gather/0.log" Apr 25 00:50:48.878635 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878603 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dtbkd/must-gather-qzlcq"] Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878930 2559 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="tokenizer" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878943 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="tokenizer" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878960 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="main" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878966 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="main" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878979 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="storage-initializer" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.878986 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="storage-initializer" Apr 25 00:50:48.879037 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.879039 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="tokenizer" Apr 25 00:50:48.879266 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.879048 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="dde6ec62-36d8-4fc0-a97a-68885ee9d7eb" containerName="main" Apr 25 00:50:48.883109 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.883091 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:48.885433 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.885412 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dtbkd\"/\"kube-root-ca.crt\"" Apr 25 00:50:48.885572 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.885529 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dtbkd\"/\"default-dockercfg-svszf\"" Apr 25 00:50:48.885684 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.885648 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dtbkd\"/\"openshift-service-ca.crt\"" Apr 25 00:50:48.891984 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:48.891966 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/must-gather-qzlcq"] Apr 25 00:50:49.009853 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.009815 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgn2d\" (UniqueName: \"kubernetes.io/projected/cf079c01-e4ac-43bd-ae9c-2575962ee063-kube-api-access-jgn2d\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.009853 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.009855 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf079c01-e4ac-43bd-ae9c-2575962ee063-must-gather-output\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.110235 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.110205 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgn2d\" (UniqueName: 
\"kubernetes.io/projected/cf079c01-e4ac-43bd-ae9c-2575962ee063-kube-api-access-jgn2d\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.110235 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.110238 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf079c01-e4ac-43bd-ae9c-2575962ee063-must-gather-output\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.110596 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.110579 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf079c01-e4ac-43bd-ae9c-2575962ee063-must-gather-output\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.118678 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.118656 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgn2d\" (UniqueName: \"kubernetes.io/projected/cf079c01-e4ac-43bd-ae9c-2575962ee063-kube-api-access-jgn2d\") pod \"must-gather-qzlcq\" (UID: \"cf079c01-e4ac-43bd-ae9c-2575962ee063\") " pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.192750 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.192723 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dtbkd/must-gather-qzlcq" Apr 25 00:50:49.317893 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.317863 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/must-gather-qzlcq"] Apr 25 00:50:49.319921 ip-10-0-129-4 kubenswrapper[2559]: W0425 00:50:49.319893 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf079c01_e4ac_43bd_ae9c_2575962ee063.slice/crio-aeef2286e7162d9d2222ea1c627c03847f61f0951573326402642a457f0da126 WatchSource:0}: Error finding container aeef2286e7162d9d2222ea1c627c03847f61f0951573326402642a457f0da126: Status 404 returned error can't find the container with id aeef2286e7162d9d2222ea1c627c03847f61f0951573326402642a457f0da126 Apr 25 00:50:49.772326 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:49.772261 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/must-gather-qzlcq" event={"ID":"cf079c01-e4ac-43bd-ae9c-2575962ee063","Type":"ContainerStarted","Data":"aeef2286e7162d9d2222ea1c627c03847f61f0951573326402642a457f0da126"} Apr 25 00:50:50.779022 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:50.778752 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/must-gather-qzlcq" event={"ID":"cf079c01-e4ac-43bd-ae9c-2575962ee063","Type":"ContainerStarted","Data":"ee4c21700341aef05b50393f3bbb6d32815ed2e7e8e5b81fedd822e54b6dda10"} Apr 25 00:50:50.779022 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:50.778794 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/must-gather-qzlcq" event={"ID":"cf079c01-e4ac-43bd-ae9c-2575962ee063","Type":"ContainerStarted","Data":"4e7b3a895151fe0a29810b59a2fc7ff3e993087f3ab2072f53ba683148e9cdea"} Apr 25 00:50:50.793140 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:50.793082 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-dtbkd/must-gather-qzlcq" podStartSLOduration=1.979654224 podStartE2EDuration="2.793065996s" podCreationTimestamp="2026-04-25 00:50:48 +0000 UTC" firstStartedPulling="2026-04-25 00:50:49.321713062 +0000 UTC m=+3428.963627913" lastFinishedPulling="2026-04-25 00:50:50.135124831 +0000 UTC m=+3429.777039685" observedRunningTime="2026-04-25 00:50:50.792743955 +0000 UTC m=+3430.434658829" watchObservedRunningTime="2026-04-25 00:50:50.793065996 +0000 UTC m=+3430.434980869" Apr 25 00:50:51.666646 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:51.666610 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hb4xm_67b28161-03e9-4905-8e32-8b7353db6c58/global-pull-secret-syncer/0.log" Apr 25 00:50:51.824183 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:51.824123 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wjpkj_0ef7298e-690a-414e-92a8-45d6a5710aa9/konnectivity-agent/0.log" Apr 25 00:50:51.850414 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:51.850381 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-4.ec2.internal_b0e49161c603a5579b9b31b2ffe9b2e8/haproxy/0.log" Apr 25 00:50:53.733178 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:53.733131 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-smsr4/must-gather-vwfzk"] Apr 25 00:50:53.733669 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:53.733422 2559 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-smsr4/must-gather-vwfzk" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="copy" containerID="cri-o://42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358" gracePeriod=2 Apr 25 00:50:53.737875 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:53.737835 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-smsr4/must-gather-vwfzk"] Apr 25 00:50:54.130586 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.130444 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smsr4_must-gather-vwfzk_ac7f2609-8be2-4377-bac0-d3a34636c139/copy/0.log" Apr 25 00:50:54.131519 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.131228 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smsr4/must-gather-vwfzk" Apr 25 00:50:54.133340 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.133298 2559 status_manager.go:895] "Failed to get status for pod" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" pod="openshift-must-gather-smsr4/must-gather-vwfzk" err="pods \"must-gather-vwfzk\" is forbidden: User \"system:node:ip-10-0-129-4.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-smsr4\": no relationship found between node 'ip-10-0-129-4.ec2.internal' and this object" Apr 25 00:50:54.266576 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.261036 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ztgn\" (UniqueName: \"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn\") pod \"ac7f2609-8be2-4377-bac0-d3a34636c139\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " Apr 25 00:50:54.266576 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.261094 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output\") pod \"ac7f2609-8be2-4377-bac0-d3a34636c139\" (UID: \"ac7f2609-8be2-4377-bac0-d3a34636c139\") " Apr 25 00:50:54.272671 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.272612 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn" (OuterVolumeSpecName: "kube-api-access-2ztgn") pod "ac7f2609-8be2-4377-bac0-d3a34636c139" (UID: "ac7f2609-8be2-4377-bac0-d3a34636c139"). InnerVolumeSpecName "kube-api-access-2ztgn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:50:54.273093 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.273070 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ac7f2609-8be2-4377-bac0-d3a34636c139" (UID: "ac7f2609-8be2-4377-bac0-d3a34636c139"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:54.361986 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.361898 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ztgn\" (UniqueName: \"kubernetes.io/projected/ac7f2609-8be2-4377-bac0-d3a34636c139-kube-api-access-2ztgn\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:50:54.361986 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.361946 2559 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac7f2609-8be2-4377-bac0-d3a34636c139-must-gather-output\") on node \"ip-10-0-129-4.ec2.internal\" DevicePath \"\""
Apr 25 00:50:54.798590 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.798558 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smsr4_must-gather-vwfzk_ac7f2609-8be2-4377-bac0-d3a34636c139/copy/0.log"
Apr 25 00:50:54.799108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.798934 2559 generic.go:358] "Generic (PLEG): container finished" podID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerID="42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358" exitCode=143
Apr 25 00:50:54.799108 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.799001 2559 scope.go:117] "RemoveContainer" containerID="42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358"
Apr 25 00:50:54.799233 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.799123 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smsr4/must-gather-vwfzk"
Apr 25 00:50:54.804675 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.804630 2559 status_manager.go:895] "Failed to get status for pod" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" pod="openshift-must-gather-smsr4/must-gather-vwfzk" err="pods \"must-gather-vwfzk\" is forbidden: User \"system:node:ip-10-0-129-4.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-smsr4\": no relationship found between node 'ip-10-0-129-4.ec2.internal' and this object"
Apr 25 00:50:54.820455 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.820230 2559 status_manager.go:895] "Failed to get status for pod" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" pod="openshift-must-gather-smsr4/must-gather-vwfzk" err="pods \"must-gather-vwfzk\" is forbidden: User \"system:node:ip-10-0-129-4.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-smsr4\": no relationship found between node 'ip-10-0-129-4.ec2.internal' and this object"
Apr 25 00:50:54.823112 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.822993 2559 scope.go:117] "RemoveContainer" containerID="0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"
Apr 25 00:50:54.853721 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.853685 2559 scope.go:117] "RemoveContainer" containerID="42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358"
Apr 25 00:50:54.854227 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:50:54.854188 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358\": container with ID starting with 42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358 not found: ID does not exist" containerID="42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358"
Apr 25 00:50:54.854343 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.854232 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358"} err="failed to get container status \"42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358\": rpc error: code = NotFound desc = could not find container \"42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358\": container with ID starting with 42ef6d5f82026a4770b3911908d4ceba4be26f675989e2aee5320b401fcfd358 not found: ID does not exist"
Apr 25 00:50:54.854343 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.854260 2559 scope.go:117] "RemoveContainer" containerID="0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"
Apr 25 00:50:54.854583 ip-10-0-129-4 kubenswrapper[2559]: E0425 00:50:54.854549 2559 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78\": container with ID starting with 0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78 not found: ID does not exist" containerID="0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"
Apr 25 00:50:54.854694 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.854590 2559 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78"} err="failed to get container status \"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78\": rpc error: code = NotFound desc = could not find container \"0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78\": container with ID starting with 0f97e031678faad1b086c749aec5f4f078c2de819e0522a6a9a3a912e1801b78 not found: ID does not exist"
Apr 25 00:50:54.941384 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:54.941338 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" path="/var/lib/kubelet/pods/ac7f2609-8be2-4377-bac0-d3a34636c139/volumes"
Apr 25 00:50:55.819005 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:55.818970 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-c8wnm_51ba2f33-ec5c-41d9-9d98-eb9b6a3d805f/authorino/0.log"
Apr 25 00:50:55.914902 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:55.914868 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-p8xhd_818481f9-8390-4653-a55c-4046045ba15b/kuadrant-console-plugin/0.log"
Apr 25 00:50:57.205437 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:57.205306 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lfc8g_f3ec85ed-0f08-4e69-b8f5-19f031ceea01/cluster-monitoring-operator/0.log"
Apr 25 00:50:57.491097 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:57.491025 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwdbl_26d781b2-1576-4d2f-acd6-2fe49496e995/node-exporter/0.log"
Apr 25 00:50:57.511237 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:57.511201 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwdbl_26d781b2-1576-4d2f-acd6-2fe49496e995/kube-rbac-proxy/0.log"
Apr 25 00:50:57.533519 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:57.533492 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwdbl_26d781b2-1576-4d2f-acd6-2fe49496e995/init-textfile/0.log"
Apr 25 00:50:59.424050 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:59.424019 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-pwflc_8bd67c57-900f-4e8c-bc50-b1e0a7960a53/networking-console-plugin/0.log"
Apr 25 00:50:59.945927 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:59.945888 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/2.log"
Apr 25 00:50:59.946122 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:50:59.945956 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-l4ktn_85cf1b2a-d5c1-4cb4-8250-bf11078d6bf5/console-operator/1.log"
Apr 25 00:51:00.493168 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493132 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"]
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493658 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="copy"
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493682 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="copy"
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493747 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="gather"
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493758 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="gather"
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493856 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="copy"
Apr 25 00:51:00.494848 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.493879 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac7f2609-8be2-4377-bac0-d3a34636c139" containerName="gather"
Apr 25 00:51:00.499751 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.499728 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.504891 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.504863 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"]
Apr 25 00:51:00.636286 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.636250 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-lib-modules\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.636606 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.636568 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-proc\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.636756 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.636741 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s69l8\" (UniqueName: \"kubernetes.io/projected/c251c5cb-987e-47f4-9e55-9229dc646630-kube-api-access-s69l8\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.636880 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.636867 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-sys\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.637060 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.637047 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-podres\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738033 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.737994 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-lib-modules\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738054 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-proc\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738083 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s69l8\" (UniqueName: \"kubernetes.io/projected/c251c5cb-987e-47f4-9e55-9229dc646630-kube-api-access-s69l8\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738109 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-sys\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738166 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-podres\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738182 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-proc\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738220 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738182 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-lib-modules\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738540 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738224 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-sys\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.738540 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.738276 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c251c5cb-987e-47f4-9e55-9229dc646630-podres\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.745534 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.745480 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69l8\" (UniqueName: \"kubernetes.io/projected/c251c5cb-987e-47f4-9e55-9229dc646630-kube-api-access-s69l8\") pod \"perf-node-gather-daemonset-2jtcg\" (UID: \"c251c5cb-987e-47f4-9e55-9229dc646630\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.815820 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.815789 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:00.893981 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.893951 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nk5lm_c1d5758c-6883-4cf2-be1b-364659aa4379/volume-data-source-validator/0.log"
Apr 25 00:51:00.961991 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:00.961932 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"]
Apr 25 00:51:01.665486 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.665433 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmhdt_154d44c3-fd83-4c64-a18a-acbfd5167f6f/dns/0.log"
Apr 25 00:51:01.686345 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.686316 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmhdt_154d44c3-fd83-4c64-a18a-acbfd5167f6f/kube-rbac-proxy/0.log"
Apr 25 00:51:01.799847 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.799821 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wkhjm_3189ee75-b94d-4dc4-a4f0-5805c80f852c/dns-node-resolver/0.log"
Apr 25 00:51:01.840314 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.840273 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg" event={"ID":"c251c5cb-987e-47f4-9e55-9229dc646630","Type":"ContainerStarted","Data":"075eb7af7dbf3c44a3c448e3f63444f1fb368512690b1c2649a69cc02844e8b2"}
Apr 25 00:51:01.840535 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.840321 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg" event={"ID":"c251c5cb-987e-47f4-9e55-9229dc646630","Type":"ContainerStarted","Data":"9834b3ff41902a702ce90bc52d930cbdae561f6e1f84b3bdb3ab64547620319f"}
Apr 25 00:51:01.841128 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.841102 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:01.859425 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:01.859363 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg" podStartSLOduration=1.859344426 podStartE2EDuration="1.859344426s" podCreationTimestamp="2026-04-25 00:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:51:01.857760492 +0000 UTC m=+3441.499675365" watchObservedRunningTime="2026-04-25 00:51:01.859344426 +0000 UTC m=+3441.501259300"
Apr 25 00:51:02.298149 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:02.298116 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4glfj_991de149-fd35-4947-8e6c-35dfa11c084c/node-ca/0.log"
Apr 25 00:51:03.231166 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:03.231138 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rslp8_497c349c-f860-4bbe-9864-bb616471b5f7/istio-proxy/0.log"
Apr 25 00:51:03.256281 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:03.256246 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84bfffbb-vgvrl_a95f5cf2-cd4a-474e-b34e-1b3165d0eb8c/router/0.log"
Apr 25 00:51:03.708679 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:03.708651 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tdws9_a805551d-fa54-4c4d-a5d2-b5057e7eb7a9/serve-healthcheck-canary/0.log"
Apr 25 00:51:04.158422 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:04.158391 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4x6ss_f39c93ca-0e47-4091-b7b7-80b2901e8795/insights-operator/0.log"
Apr 25 00:51:04.159733 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:04.159702 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4x6ss_f39c93ca-0e47-4091-b7b7-80b2901e8795/insights-operator/1.log"
Apr 25 00:51:04.313126 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:04.313098 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmmfk_6af5e39d-3c39-4d8c-b886-74bd88370c79/kube-rbac-proxy/0.log"
Apr 25 00:51:04.334033 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:04.334009 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmmfk_6af5e39d-3c39-4d8c-b886-74bd88370c79/exporter/0.log"
Apr 25 00:51:04.355209 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:04.355178 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmmfk_6af5e39d-3c39-4d8c-b886-74bd88370c79/extractor/0.log"
Apr 25 00:51:06.879103 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:06.879071 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-58fbc56fdc-wb76b_16199b27-4899-4f66-8124-758e67b43ef3/manager/0.log"
Apr 25 00:51:06.901512 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:06.901485 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-hkq74_6cc8cade-b8aa-42e0-802b-bb56b9a2f9b3/openshift-lws-operator/0.log"
Apr 25 00:51:07.946723 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:07.946692 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-kvvnd_96cb7b04-a754-42fc-8678-8630e199a460/s3-init/0.log"
Apr 25 00:51:08.855480 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:08.855440 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2jtcg"
Apr 25 00:51:12.807735 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:12.807700 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b9kkx_f8c356cb-0fee-47ee-a119-26d729d14274/kube-storage-version-migrator-operator/1.log"
Apr 25 00:51:12.810689 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:12.810661 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b9kkx_f8c356cb-0fee-47ee-a119-26d729d14274/kube-storage-version-migrator-operator/0.log"
Apr 25 00:51:14.103743 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.103714 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:51:14.123152 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.123124 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/egress-router-binary-copy/0.log"
Apr 25 00:51:14.142253 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.142229 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/cni-plugins/0.log"
Apr 25 00:51:14.161078 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.161054 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/bond-cni-plugin/0.log"
Apr 25 00:51:14.183158 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.183130 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/routeoverride-cni/0.log"
Apr 25 00:51:14.202915 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.202892 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/whereabouts-cni-bincopy/0.log"
Apr 25 00:51:14.225290 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.225265 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-shvmr_aa3ab10f-a4b0-49f5-8458-86e3138f3237/whereabouts-cni/0.log"
Apr 25 00:51:14.261706 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.261679 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dz6nf_cc64b50b-da56-49cb-b2a2-054b925980cf/kube-multus/0.log"
Apr 25 00:51:14.348296 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.348264 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2s8sw_f8df9612-54a5-4673-b2cc-33d7768fe61c/network-metrics-daemon/0.log"
Apr 25 00:51:14.366227 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:14.366126 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2s8sw_f8df9612-54a5-4673-b2cc-33d7768fe61c/kube-rbac-proxy/0.log"
Apr 25 00:51:15.607192 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.607158 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-controller/0.log"
Apr 25 00:51:15.623476 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.623439 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/0.log"
Apr 25 00:51:15.654878 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.654853 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovn-acl-logging/1.log"
Apr 25 00:51:15.675434 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.675395 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/kube-rbac-proxy-node/0.log"
Apr 25 00:51:15.699347 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.699319 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:51:15.715536 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.715498 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/northd/0.log"
Apr 25 00:51:15.735030 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.735006 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/nbdb/0.log"
Apr 25 00:51:15.757442 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.757410 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/sbdb/0.log"
Apr 25 00:51:15.988645 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:15.988603 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bfwmd_3a0dad7b-4a0e-485f-9092-becacb1cd8a8/ovnkube-controller/0.log"
Apr 25 00:51:17.422320 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:17.422284 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-rfr4p_10b4da2b-ad8f-4af5-9e0b-28885ad2debc/check-endpoints/0.log"
Apr 25 00:51:17.474856 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:17.474826 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jz5tk_800b9072-49a3-4275-947c-a73644d8448e/network-check-target-container/0.log"
Apr 25 00:51:18.490555 ip-10-0-129-4 kubenswrapper[2559]: I0425 00:51:18.490525 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-skfzd_e2c1b3f5-cd6d-4849-a473-0eb71003f6b1/iptables-alerter/0.log"