Apr 24 22:29:53.966928 ip-10-0-132-138 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:54.384582 ip-10-0-132-138 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:54.384582 ip-10-0-132-138 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:54.384582 ip-10-0-132-138 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:54.385192 ip-10-0-132-138 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:54.385192 ip-10-0-132-138 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:54.386204 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.386114 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
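The five deprecation warnings above all point to the same migration: move these settings into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf) instead of passing them as flags. Below is a minimal sketch of the equivalent config-file fields, not this node's actual file; the values are taken from the flag dump later in this log, except the evictionHard thresholds, which are illustrative placeholders because --eviction-hard is empty on this node. --pod-infra-container-image has no config-file equivalent, which is why the server.go:212 line says the sandbox image should also be set in the remote runtime's own configuration.

    # Sketch of KubeletConfiguration fields replacing the deprecated flags (assumed layout).
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # replaces --container-runtime-endpoint
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
    systemReserved:                                                # replaces --system-reserved
      cpu: "500m"
      memory: "1Gi"
      ephemeral-storage: "1Gi"
    evictionHard:                # per the warning, eviction settings supersede --minimum-container-ttl-duration
      memory.available: "100Mi"  # illustrative threshold, not from this log
      nodefs.available: "10%"    # illustrative threshold, not from this log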
Apr 24 22:29:54.390114 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390098 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:54.390114 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390114 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390118 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390121 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390125 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390128 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390131 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390134 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390137 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390139 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390142 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390146 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390149 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390152 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390155 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390157 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390160 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390162 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390165 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390167 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390170 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:54.390180 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390173 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390175 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390178 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390182 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390192 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390196 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390199 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390202 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390204 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390207 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390209 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390212 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390215 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390218 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390221 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390224 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390226 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390229 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390231 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390234 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:54.390701 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390236 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390239 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390243 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390245 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390248 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390250 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390253 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390255 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390258 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390260 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390263 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390265 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390268 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390271 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390273 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390276 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390279 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390281 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390285 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390289 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:54.391243 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390292 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390294 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390297 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390299 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390302 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390304 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390307 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390310 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390312 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390315 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390317 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390320 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390322 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390327 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390330 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390332 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390335 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390337 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390340 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390343 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:54.391742 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390345 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390348 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390350 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390352 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390356 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390772 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390778 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390781 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390784 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390787 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390790 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390792 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390795 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390798 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390800 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390803 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390805 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390808 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390810 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:54.392228 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390813 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390815 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390818 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390821 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390823 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390826 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390829 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390832 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390835 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390837 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390840 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390843 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390846 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390848 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390851 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390853 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390856 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390859 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390863 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390867 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:54.392927 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390870 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390873 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390876 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390880 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390883 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390885 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390888 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390890 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390893 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390896 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390898 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390901 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390904 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390906 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390909 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390912 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390914 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390918 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390920 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390923 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:54.393853 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390926 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390929 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390931 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390934 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390937 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390940 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390942 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390945 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390947 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390949 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390953 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390956 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390959 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390961 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390963 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390967 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390970 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390972 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390975 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390978 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:54.394498 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390980 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390983 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390985 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390988 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390990 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390993 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390995 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.390998 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.391001 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.391003 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.391006 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:54.395234 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.391008 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
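Every "unrecognized feature gate" warning above names an OpenShift-level gate that the kubelet binary does not define; the kubelet skips them and acts only on the gates it knows (the effective set is printed at feature_gate.go:384 near the end of this log). Since --feature-gates="" in the flag dump that follows, these gates are evidently arriving through the featureGates stanza of /etc/kubernetes/kubelet.conf rather than the command line. A hedged sketch of what such a stanza could look like, limited to gates this log explicitly shows being set; the OpenShift-only entry and its value are illustrative:

    # Sketch of a featureGates stanza in the KubeletConfiguration; not the node's actual file.
    featureGates:
      KMSv1: true                          # logged: "Setting deprecated feature gate KMSv1=true"
      ServiceAccountTokenNodeBinding: true # logged: "Setting GA feature gate ServiceAccountTokenNodeBinding=true"
      RouteAdvertisements: true            # illustrative OpenShift-only gate; kubelet logs it as unrecognized and ignores it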
--config="/etc/kubernetes/kubelet.conf" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392534 2575 flags.go:64] FLAG: --config-dir="" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392539 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392544 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392553 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392558 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392572 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392578 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392583 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392601 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392607 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392613 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392618 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392626 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 22:29:54.395802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392631 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392635 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392640 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392646 2575 flags.go:64] FLAG: --enable-server="true" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392650 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392658 2575 flags.go:64] FLAG: --event-burst="100" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392664 2575 flags.go:64] FLAG: --event-qps="50" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392669 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392675 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392682 2575 flags.go:64] FLAG: --eviction-hard="" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392689 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392694 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 22:29:54.396468 ip-10-0-132-138 
kubenswrapper[2575]: I0424 22:29:54.392699 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392705 2575 flags.go:64] FLAG: --eviction-soft="" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392709 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392714 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392718 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392723 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392728 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392733 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392737 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392744 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392749 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392754 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392759 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392764 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:54.396468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392769 2575 flags.go:64] FLAG: --help="false" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392774 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-132-138.ec2.internal" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392780 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392785 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392790 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392795 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392802 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392807 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392812 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392817 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392821 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:54.397166 
ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392826 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392832 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392837 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392842 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392848 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392854 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392859 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392864 2575 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392869 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392873 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392879 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392888 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:54.397166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392893 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392898 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392903 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392908 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392913 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392918 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392923 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392931 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392936 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392942 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392947 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392952 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392957 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392962 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" 
Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392967 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392972 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392977 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392992 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.392997 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393002 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393007 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393012 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393021 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393026 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:54.397781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393032 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393037 2575 flags.go:64] FLAG: --port="10250" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393043 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393048 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-004310d5082998287" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393053 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393059 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393064 2575 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393069 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393074 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393080 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393085 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393090 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393095 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393102 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393107 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393112 2575 
flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393117 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393122 2575 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393126 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393131 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393136 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393141 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393146 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393150 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393155 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393161 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:54.398363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393166 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393171 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393176 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393181 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393186 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393191 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393197 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393208 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393213 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393218 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393226 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393230 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393235 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393240 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393245 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:54.398988 ip-10-0-132-138 
kubenswrapper[2575]: I0424 22:29:54.393249 2575 flags.go:64] FLAG: --v="2" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393256 2575 flags.go:64] FLAG: --version="false" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393268 2575 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393275 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393281 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393522 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393531 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393537 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393542 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:54.398988 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393546 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393551 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393555 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393560 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393564 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393570 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393577 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393581 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393585 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393607 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393612 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393616 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393620 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393624 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393629 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393633 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393637 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393642 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393647 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:54.399618 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393652 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393656 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393660 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393664 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393669 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393674 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393678 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393682 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393687 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 
22:29:54.393691 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393695 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393699 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393703 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393708 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393712 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393716 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393721 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393726 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393731 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393736 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:54.400153 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393741 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393746 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393750 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393754 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393758 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393763 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393767 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393771 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393776 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393781 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393785 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393789 2575 feature_gate.go:328] unrecognized feature 
gate: VSphereMultiNetworks Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393794 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393799 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393804 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393808 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393812 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393816 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393820 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393824 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:54.400669 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393829 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393836 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393842 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393847 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393852 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393856 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393861 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393865 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393869 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393873 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393878 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393882 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393886 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393891 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:54.401146 
ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393895 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393899 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393904 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393908 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393913 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393917 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:54.401146 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393922 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393926 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.393930 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.393939 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.401186 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.401207 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401264 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401270 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401274 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401277 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401280 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401284 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401287 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401291 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:54.401662 ip-10-0-132-138 
kubenswrapper[2575]: W0424 22:29:54.401293 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401296 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:54.401662 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401299 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401302 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401306 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401309 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401312 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401315 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401318 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401320 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401323 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401325 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401328 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401331 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401333 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401336 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401338 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401341 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401343 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401345 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401348 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401350 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:54.402079 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401353 2575 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401358 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401362 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401364 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401367 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401369 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401372 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401374 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401377 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401379 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401382 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401384 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401387 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401390 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401392 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401395 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401397 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401400 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401403 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:54.402563 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401406 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401408 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401411 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401413 2575 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401416 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401418 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401421 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401423 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401426 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401430 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401434 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401437 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401440 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401443 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401446 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401450 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401453 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401455 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401458 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:54.403134 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401462 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
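Every pass over the kubelet's feature-gate configuration re-emits one warning per gate the upstream kubelet does not register, so the same OpenShift-specific names recur in blocks. A minimal sketch for tallying the distinct gate names and how often each is warned about, assuming the journal text is piped in on stdin (the script name and invocation are illustrative, not part of any tooling):

    #!/usr/bin/env python3
    # tally_gates.py -- count 'unrecognized feature gate' warnings.
    # Hypothetical usage: journalctl -u kubelet | python3 tally_gates.py
    import re
    import sys
    from collections import Counter

    PATTERN = re.compile(r"unrecognized feature gate: (\S+)")
    counts = Counter(m.group(1) for m in PATTERN.finditer(sys.stdin.read()))

    # Each gate appears once per parse pass, so uniform counts across gates
    # indicate repeated parses of the same config rather than new gates.
    for gate, n in sorted(counts.items()):
        print(f"{n:3d}  {gate}")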
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401465 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401468 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401470 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401473 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401475 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401478 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401481 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401483 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401486 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401488 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401491 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401494 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401496 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401499 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401501 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401504 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:54.403667 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401507 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.401512 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401645 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401649 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401653 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401656 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401658 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401661 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401664 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401667 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401670 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401674 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401677 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401680 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401682 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:54.404088 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401685 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401688 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401690 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401693 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401696 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401698 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401701 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401703 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401706 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401708 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401710 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401713 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401716 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401719 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401722 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401724 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401727 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401730 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401733 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401736 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:54.404466 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401738 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401741 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401743 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401745 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401748 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401750 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401753 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401755 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401758 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401760 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401764 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401766 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401769 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401772 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401774 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401776 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401779 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401781 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401784 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:54.404992 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401787 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401789 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401792 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401795 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401797 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401800 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401803 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401805 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401808 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401810 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401813 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401815 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401817 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401820 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401822 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401825 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401829 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401831 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401834 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401837 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:54.405438 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401839 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401842 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401844 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401847 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401850 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401853 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401855 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401858 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401861 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
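The authoritative result of each parse pass is the single feature_gate.go:384 info line, which prints the effective gate map. A sketch of turning that map line into a Python dict (parse_gate_map is an illustrative helper, not part of any tooling):

    import re

    def parse_gate_map(line: str) -> dict:
        """Parse a klog 'feature gates: {map[Name:bool ...]}' line."""
        body = re.search(r"feature gates: \{map\[(.*)\]\}", line).group(1)
        return {name: val == "true"
                for name, val in (pair.rsplit(":", 1) for pair in body.split())}

    # e.g. parse_gate_map(line)["KMSv1"] is True here, consistent with the
    # 'Setting deprecated feature gate KMSv1=true' warning above.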
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401865 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401868 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401871 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401873 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:54.401876 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.401881 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:54.405947 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.403186 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:29:54.406317 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.405292 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:29:54.406317 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.406225 2575 server.go:1019] "Starting client certificate rotation"
Apr 24 22:29:54.406371 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.406320 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:54.406371 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.406362 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:54.428923 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.428897 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:54.432692 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.432668 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:54.445825 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.445803 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:29:54.451610 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.451577 2575 log.go:25] "Validated CRI v1 image API"
Apr 24 22:29:54.452944 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.452926 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:29:54.458800 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.458779 2575 fs.go:135] Filesystem UUIDs: map[673ed485-ef53-46ea-a9a4-28e34867cecb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9afefcda-f875-461d-bd0a-03ceac6cdef7:/dev/nvme0n1p3]
Apr 24 22:29:54.458849 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.458800 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:29:54.464498 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.464387 2575 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:54.462478302 +0000 UTC m=+0.384121554 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099866 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20bc508a1a0e73ddbc1634d4f1d2cf SystemUUID:ec20bc50-8a1a-0e73-ddbc-1634d4f1d2cf BootID:75336873-b2e1-40fb-9f12-01a2ec735bd2 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:95:97:ce:a6:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:95:97:ce:a6:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:d0:62:10:ac:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:54.464498 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.464496 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
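The cAdvisor Machine: entry above is one flattened Go struct; the filesystem capacities inside it are plain byte counts. A sketch for pulling Device -> Capacity pairs out of that line (filesystem_capacities is an illustrative name, not part of any tooling):

    import re

    def filesystem_capacities(machine_line: str) -> dict:
        """Extract Device -> Capacity (bytes) from a cAdvisor 'Machine:' log line."""
        pairs = re.findall(
            r"\{Device:(\S+) DeviceMajor:\d+ DeviceMinor:\d+ Capacity:(\d+)",
            machine_line)
        return {device: int(capacity) for device, capacity in pairs}

    # For this log: /dev/nvme0n1p4 -> 128243970048 bytes (~119.4 GiB), the
    # /var partition reported by the fs.go:136 entry above.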
Apr 24 22:29:54.464656 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.464643 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:54.465645 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.465618 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:54.465793 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.465648 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-138.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 22:29:54.465843 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.465803 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 22:29:54.465843 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.465812 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 22:29:54.465843 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.465828 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:54.466560 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.466550 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:54.467460 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.467450 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:54.467566 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.467557 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 22:29:54.470017 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.470006 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 22:29:54.470061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.470026 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 22:29:54.470061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.470038 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 22:29:54.470061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.470048 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 24 22:29:54.470061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.470056 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 22:29:54.471107 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.471095 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:54.471158 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.471114 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:54.473975 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.473958 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 22:29:54.475569 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.475540 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 22:29:54.475693 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.475677 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:54.477538 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477521 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477542 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477552 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477563 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477570 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477577 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477583 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477600 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477608 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477614 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 22:29:54.477625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477629 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 22:29:54.477928 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.477638 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 22:29:54.479270 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.479260 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 22:29:54.479270 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.479270 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 22:29:54.483255 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.483112 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 22:29:54.483338 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.483280 2575 server.go:1295] "Started kubelet"
Apr 24 22:29:54.483432 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.483377 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 22:29:54.483476 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.483424 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 22:29:54.483515 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.483504 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 22:29:54.485473 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.485453 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 22:29:54.485776 ip-10-0-132-138 systemd[1]: Started Kubernetes Kubelet.
Apr 24 22:29:54.488146 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.488126 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 22:29:54.488788 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.488760 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-138.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 22:29:54.488882 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.488842 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-138.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 22:29:54.489318 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.489094 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 22:29:54.494392 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.494373 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 22:29:54.494480 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.494395 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:54.494993 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.494957 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:54.495159 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495145 2575 factory.go:55] Registering systemd factory
Apr 24 22:29:54.495211 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495165 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 22:29:54.495211 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495169 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 24 22:29:54.495303 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495195 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 22:29:54.495303 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495225 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:29:54.495393 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495315 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:29:54.495393 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495326 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:29:54.495482 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495465 2575 factory.go:153] Registering CRI-O factory
Apr 24 22:29:54.495482 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495478 2575 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:29:54.495568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495527 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 22:29:54.495568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495557 2575 factory.go:103] Registering Raw factory
Apr 24 22:29:54.495661 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495571 2575 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:29:54.496152 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.495971 2575 manager.go:319] Starting recovery of all containers
Apr 24 22:29:54.496425 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.496317 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 22:29:54.499446 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.499411 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-138.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 22:29:54.499549 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.499491 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 22:29:54.500528 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.499583 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-138.ec2.internal.18a96b9011402d22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-138.ec2.internal,UID:ip-10-0-132-138.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-138.ec2.internal,},FirstTimestamp:2026-04-24 22:29:54.483252514 +0000 UTC m=+0.404895766,LastTimestamp:2026-04-24 22:29:54.483252514 +0000 UTC m=+0.404895766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-138.ec2.internal,}"
Apr 24 22:29:54.502734 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.502705 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qhzrs"
Apr 24 22:29:54.505366 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.505346 2575 manager.go:324] Recovery completed
Apr 24 22:29:54.506950 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.506932 2575 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": readdirent /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 24 22:29:54.510023 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.510009 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.512447 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512433 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.512502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512461 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.512502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512473 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.512935 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512922 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:29:54.512935 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512933 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:29:54.513014 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.512948 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:54.514914 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.514835 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-138.ec2.internal.18a96b9012fda71a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-138.ec2.internal,UID:ip-10-0-132-138.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-138.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-138.ec2.internal,},FirstTimestamp:2026-04-24 22:29:54.512447258 +0000 UTC m=+0.434090510,LastTimestamp:2026-04-24 22:29:54.512447258 +0000 UTC m=+0.434090510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-138.ec2.internal,}"
Apr 24 22:29:54.516101 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.516089 2575 policy_none.go:49] "None policy: Start"
Apr 24 22:29:54.516163 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.516105 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:29:54.516163 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.516116 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:29:54.516693 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.516679 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qhzrs"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.559642 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.560823 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.560850 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.560868 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
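The "Creating Container Manager object based on Node Config" entry above embeds the full node config as JSON, including the hard-eviction thresholds (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%) and the system-reserved resources (500m CPU, 1Gi memory, 1Gi ephemeral-storage). A sketch that recovers the thresholds from that line (hard_eviction_thresholds is an illustrative helper, assuming the entry occupies a single line as above):

    import json
    import re

    def hard_eviction_thresholds(node_config_line: str) -> list:
        """Decode the nodeConfig={...} JSON embedded in the log line and
        return its HardEvictionThresholds list."""
        blob = re.search(r"nodeConfig=(\{.*\})", node_config_line).group(1)
        return json.loads(blob)["HardEvictionThresholds"]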
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.560876 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.560908 2575 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.563821 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565220 2575 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.565298 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565308 2575 server.go:85] "Starting device plugin registration server"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565539 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565548 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565658 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565750 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.565761 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.566207 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:29:54.571781 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.566238 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:54.661934 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.661814 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"]
Apr 24 22:29:54.661934 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.661935 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.663086 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.663068 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.663205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.663097 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.663205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.663108 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.665395 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.665383 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.665528 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.665513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.665574 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.665547 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.665657 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.665647 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.666173 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666147 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.666173 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666160 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.666173 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666173 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666182 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666186 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666193 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666220 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666245 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666259 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.666336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.666286 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.668427 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.668414 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.668472 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.668439 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:54.669087 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.669074 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:54.669164 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.669092 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:54.669164 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.669104 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:54.673371 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.673356 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.673458 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.673378 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-138.ec2.internal\": node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:54.702518 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.702487 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-138.ec2.internal\" not found" node="ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.706142 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.706120 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:54.706944 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.706927 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-138.ec2.internal\" not found" node="ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.796445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.796409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.796445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.796443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.796693 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.796462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55b514acdb5c12be2a393c574a525294-config\") pod \"kube-apiserver-proxy-ip-10-0-132-138.ec2.internal\" (UID: \"55b514acdb5c12be2a393c574a525294\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.806512 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.806486 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:54.896884 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.896884 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.897094 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55b514acdb5c12be2a393c574a525294-config\") pod \"kube-apiserver-proxy-ip-10-0-132-138.ec2.internal\" (UID: \"55b514acdb5c12be2a393c574a525294\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.897094 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896955 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55b514acdb5c12be2a393c574a525294-config\") pod \"kube-apiserver-proxy-ip-10-0-132-138.ec2.internal\" (UID: \"55b514acdb5c12be2a393c574a525294\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.897094 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.897094 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:54.896977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ee252bcd9db0f1642e38552116315a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal\" (UID: \"2ee252bcd9db0f1642e38552116315a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:54.906979 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:54.906947 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.004636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.004606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:55.007104 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.007088 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.009255 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.009237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:55.107604 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.107550 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.208162 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.208128 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.308708 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.308627 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.406208 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.406175 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:55.406919 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.406329 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:55.409348 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.409320 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.495469 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.495440 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:55.509658 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.509625 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.512759 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.512736 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:55.519274 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.519246 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:54 +0000 UTC" deadline="2027-11-22 23:20:27.942497626 +0000 UTC"
Apr 24 22:29:55.519274 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.519272 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13848h50m32.423230157s"
Apr 24 22:29:55.532506 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.532485 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qm6t6"
Apr 24 22:29:55.538231 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:55.538187 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b514acdb5c12be2a393c574a525294.slice/crio-ed4894f469ce2b78358f8189ec919927553f7889253e6d2b78ad3e0b5eae866c WatchSource:0}: Error finding container ed4894f469ce2b78358f8189ec919927553f7889253e6d2b78ad3e0b5eae866c: Status 404 returned error can't find the container with id ed4894f469ce2b78358f8189ec919927553f7889253e6d2b78ad3e0b5eae866c
Apr 24 22:29:55.538836 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:55.538814 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee252bcd9db0f1642e38552116315a1.slice/crio-dafda2b9a7fc4b0f56f282fdc38c9f4fac06c0522ae735dc391bb004844e3b58 WatchSource:0}: Error finding container dafda2b9a7fc4b0f56f282fdc38c9f4fac06c0522ae735dc391bb004844e3b58: Status 404 returned error can't find the container with id dafda2b9a7fc4b0f56f282fdc38c9f4fac06c0522ae735dc391bb004844e3b58
Apr 24 22:29:55.543821 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.543805 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:55.545388 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.545373 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qm6t6"
Apr 24 22:29:55.563936 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.563855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal" event={"ID":"2ee252bcd9db0f1642e38552116315a1","Type":"ContainerStarted","Data":"dafda2b9a7fc4b0f56f282fdc38c9f4fac06c0522ae735dc391bb004844e3b58"}
Apr 24 22:29:55.564844 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.564815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal" event={"ID":"55b514acdb5c12be2a393c574a525294","Type":"ContainerStarted","Data":"ed4894f469ce2b78358f8189ec919927553f7889253e6d2b78ad3e0b5eae866c"}
Apr 24 22:29:55.610456 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.610425 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.710973 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.710936 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.743758 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.743722 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:55.779117 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.779072 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
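
csr.go:274 ("approved, waiting to be issued") and csr.go:270 ("issued") are two distinct CSR states: an approver has set the Approved condition, but the signer has not yet written status.certificate. A client-go sketch checking both for the kubelet-serving request csr-qm6t6 — the kubeconfig location is an assumption:

// csr_status.go — illustrative: distinguish "approved" from "issued" for a
// CertificateSigningRequest, mirroring the csr.go messages above.
package main

import (
	"context"
	"fmt"

	certificatesv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile) // assumes ~/.kube/config
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	csr, err := cs.CertificatesV1().CertificateSigningRequests().
		Get(context.TODO(), "csr-qm6t6", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	approved := false
	for _, cond := range csr.Status.Conditions {
		if cond.Type == certificatesv1.CertificateApproved && cond.Status == "True" {
			approved = true
		}
	}
	// "approved, waiting to be issued" == Approved condition set, empty certificate.
	fmt.Printf("approved=%v issued=%v\n", approved, len(csr.Status.Certificate) > 0)
}
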
Apr 24 22:29:55.811347 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:55.811310 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-138.ec2.internal\" not found"
Apr 24 22:29:55.871269 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.871192 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:55.895267 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.895236 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:55.908143 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.908112 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:55.909057 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.909037 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
Apr 24 22:29:55.920876 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:55.920848 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:56.472498 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.472462 2575 apiserver.go:52] "Watching apiserver"
Apr 24 22:29:56.483016 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.482969 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:29:56.483399 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.483374 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-72jb2","openshift-image-registry/node-ca-ltpdf","openshift-multus/multus-57wkd","openshift-multus/multus-additional-cni-plugins-m4nz6","openshift-multus/network-metrics-daemon-6h7k6","openshift-network-operator/iptables-alerter-bfb6l","openshift-ovn-kubernetes/ovnkube-node-9blws","kube-system/konnectivity-agent-tw4b7","kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx","openshift-cluster-node-tuning-operator/tuned-2gfr9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal","openshift-network-diagnostics/network-check-target-kb7rl"]
Apr 24 22:29:56.486036 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.486012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.488181 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.488134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ltpdf"
Apr 24 22:29:56.489094 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.489062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h949l\""
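
The warnings.go:110 messages fire because mirror-pod names embed the node name: a pod name like kube-apiserver-proxy-ip-10-0-132-138.ec2.internal is a valid DNS subdomain but not a valid DNS label, and the pod's hostname is derived from its name. The apimachinery validators show the difference directly:

// name_check.go — illustrative: the validation behind the "metadata.name ...
// a DNS label is recommended: [must not contain dots]" warnings above.
package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/util/validation"
)

func main() {
	name := "kube-apiserver-proxy-ip-10-0-132-138.ec2.internal"
	// Pod names must be DNS subdomains: dots are fine, so this returns no errors.
	fmt.Println("as DNS subdomain:", validation.IsDNS1123Subdomain(name))
	// Hostnames must be DNS labels: the dots trigger the logged warning.
	fmt.Println("as DNS label:", validation.IsDNS1123Label(name))
}
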
Need to start a new one" pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.490470 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.490356 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:56.490777 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.490582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:56.490777 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.490664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:56.491839 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.491821 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:56.491930 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.491876 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:56.492990 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.492802 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.493290 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.493273 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:56.494636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.494273 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:56.494636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.494479 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:56.494636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.494498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:56.495308 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495291 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:56.495404 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.495367 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495631 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v5cl9\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495783 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7xlk2\"" Apr 24 22:29:56.495845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.495661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86tzl\"" Apr 24 22:29:56.497279 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.497262 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:56.497613 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.497579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.499878 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.499857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.501022 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.500996 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:56.501119 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.501075 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:56.501119 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.501110 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sj7g7\"" Apr 24 22:29:56.501330 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.501141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:56.502164 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.502144 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.502483 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.502465 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tsrpb\"" Apr 24 22:29:56.502735 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.502719 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:56.504840 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.504822 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:56.504945 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.504900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.504945 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.504926 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:29:56.505329 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.505307 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-t2fll\"" Apr 24 22:29:56.505415 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.505306 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:56.506219 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-os-release\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506301 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-netns\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506301 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-multus\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506301 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjhh\" (UniqueName: \"kubernetes.io/projected/62d9f82c-64e8-47dd-9c00-4a979c247925-kube-api-access-wdjhh\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506301 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-system-cni-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " 
pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.506441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-systemd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-node-log\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-system-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506528 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-cnibin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506528 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-k8s-cni-cncf-io\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506528 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-kubelet\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-os-release\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.506660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.506660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506613 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/993dc3a3-4c4d-4d45-92a5-a952464091dc-host\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.506660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-var-lib-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-log-socket\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-bin\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-config\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-daemon-config\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-kubelet\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.506834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-etc-openvswitch\") pod \"ovnkube-node-9blws\" 
(UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-netd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/993dc3a3-4c4d-4d45-92a5-a952464091dc-serviceca\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-socket-dir-parent\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-bin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.506955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7g7k\" (UniqueName: \"kubernetes.io/projected/993dc3a3-4c4d-4d45-92a5-a952464091dc-kube-api-access-f7g7k\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-etc-kubernetes\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cnibin\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: 
I0424 22:29:56.507065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.507111 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgp9v\" (UniqueName: \"kubernetes.io/projected/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-kube-api-access-wgp9v\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-netns\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-env-overrides\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde4b0d1-afe8-471e-9274-67dea8902733-ovn-node-metrics-cert\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-conf-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507281 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfptc\" (UniqueName: \"kubernetes.io/projected/b4a4f470-707a-47cd-a98e-5cc998b168bc-kube-api-access-kfptc\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.507512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.507971 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.508382 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.507805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-ovn\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.508479 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkglj\" (UniqueName: \"kubernetes.io/projected/cde4b0d1-afe8-471e-9274-67dea8902733-kube-api-access-pkglj\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.508479 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-cni-binary-copy\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.508584 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-multus-certs\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.508584 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.508750 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmfl\" (UniqueName: \"kubernetes.io/projected/4d5279a2-c42c-42b0-a00f-df176466bd90-kube-api-access-vqmfl\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 
Apr 24 22:29:56.508750 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a4f470-707a-47cd-a98e-5cc998b168bc-iptables-alerter-script\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l"
Apr 24 22:29:56.508750 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-systemd-units\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.508750 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.508696 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-hostroot\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd"
Apr 24 22:29:56.509415 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509299 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 22:29:56.509415 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a4f470-707a-47cd-a98e-5cc998b168bc-host-slash\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l"
Apr 24 22:29:56.509415 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509358 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4847g\""
Apr 24 22:29:56.509415 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-slash\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.509716 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-script-lib\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.509716 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd"
Apr 24 22:29:56.510232 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.509903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 22:29:56.510232 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.510015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 22:29:56.510232 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.510136 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:56.510413 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.510389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:29:56.510739 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.510487 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gng82\""
Apr 24 22:29:56.510739 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.510485 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e"
Apr 24 22:29:56.510739 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.510394 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:56.546275 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.546250 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:55 +0000 UTC" deadline="2027-11-20 00:37:36.257798606 +0000 UTC"
Apr 24 22:29:56.546275 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.546272 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13778h7m39.711527972s"
Apr 24 22:29:56.596054 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.596027 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 22:29:56.609723 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.609723 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-ovn\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.609907 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkglj\" (UniqueName: \"kubernetes.io/projected/cde4b0d1-afe8-471e-9274-67dea8902733-kube-api-access-pkglj\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.609907 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-cni-binary-copy\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd"
Apr 24 22:29:56.609907 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610014 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-ovn\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610014 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6"
Apr 24 22:29:56.610014 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-sys\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9"
Apr 24 22:29:56.610014 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.609997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-tmp\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-agent-certs\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-systemd-units\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-hostroot\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-var-lib-kubelet\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-hostroot\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a4f470-707a-47cd-a98e-5cc998b168bc-host-slash\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l"
Apr 24 22:29:56.610167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-systemd-units\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-slash\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a4f470-707a-47cd-a98e-5cc998b168bc-host-slash\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l"
Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-script-lib\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-slash\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjhh\"
(UniqueName: \"kubernetes.io/projected/62d9f82c-64e8-47dd-9c00-4a979c247925-kube-api-access-wdjhh\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-system-cni-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-sys-fs\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-system-cni-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.610481 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-run\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-systemd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-system-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " 
pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-run-systemd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-tmp-dir\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-system-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-host\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfptc\" (UniqueName: \"kubernetes.io/projected/b4a4f470-707a-47cd-a98e-5cc998b168bc-kube-api-access-kfptc\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/993dc3a3-4c4d-4d45-92a5-a952464091dc-host\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-bin\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-daemon-config\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/993dc3a3-4c4d-4d45-92a5-a952464091dc-host\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " 
pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-cni-binary-copy\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbxr\" (UniqueName: \"kubernetes.io/projected/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-kube-api-access-lqbxr\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysconfig\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-tuned\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.610940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-netd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-script-lib\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.610994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/993dc3a3-4c4d-4d45-92a5-a952464091dc-serviceca\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-socket-dir-parent\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-netd\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-kubernetes\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-konnectivity-ca\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-socket-dir-parent\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7g7k\" (UniqueName: \"kubernetes.io/projected/993dc3a3-4c4d-4d45-92a5-a952464091dc-kube-api-access-f7g7k\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 
22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-cni-bin\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cnibin\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cnibin\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-conf\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-netns\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.611738 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-env-overrides\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde4b0d1-afe8-471e-9274-67dea8902733-ovn-node-metrics-cert\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-daemon-config\") pod 
\"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-conf-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-multus-certs\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-netns\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-hosts-file\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjqm\" (UniqueName: \"kubernetes.io/projected/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-kube-api-access-ssjqm\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-run-ovn-kubernetes\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/993dc3a3-4c4d-4d45-92a5-a952464091dc-serviceca\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmfl\" 
(UniqueName: \"kubernetes.io/projected/4d5279a2-c42c-42b0-a00f-df176466bd90-kube-api-access-vqmfl\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-conf-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a4f470-707a-47cd-a98e-5cc998b168bc-iptables-alerter-script\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-multus-certs\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.612306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-os-release\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-netns\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-multus\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-netns\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-os-release\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-os-release\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-env-overrides\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611758 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-multus\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-multus-cni-dir\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-socket-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-os-release\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.611871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-node-log\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a4f470-707a-47cd-a98e-5cc998b168bc-iptables-alerter-script\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-cnibin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-node-log\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-k8s-cni-cncf-io\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-cnibin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-kubelet\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-kubelet\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-registration-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-run-k8s-cni-cncf-io\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-var-lib-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-log-socket\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-config\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-var-lib-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-log-socket\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.613959 ip-10-0-132-138 
kubenswrapper[2575]: E0424 22:29:56.612791 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.612866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:57.112828045 +0000 UTC m=+3.034471286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-kubelet\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.613959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-etc-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-bin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.612980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgp9v\" (UniqueName: \"kubernetes.io/projected/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-kube-api-access-wgp9v\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-etc-openvswitch\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-modprobe-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde4b0d1-afe8-471e-9274-67dea8902733-host-kubelet\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-host-var-lib-cni-bin\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-lib-modules\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-etc-kubernetes\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-device-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde4b0d1-afe8-471e-9274-67dea8902733-ovnkube-config\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:29:56.613164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62d9f82c-64e8-47dd-9c00-4a979c247925-etc-kubernetes\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qncbd\" (UniqueName: \"kubernetes.io/projected/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kube-api-access-qncbd\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.613204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-systemd\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.614790 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.614543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde4b0d1-afe8-471e-9274-67dea8902733-ovn-node-metrics-cert\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.619195 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.619170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfptc\" (UniqueName: \"kubernetes.io/projected/b4a4f470-707a-47cd-a98e-5cc998b168bc-kube-api-access-kfptc\") pod \"iptables-alerter-bfb6l\" (UID: \"b4a4f470-707a-47cd-a98e-5cc998b168bc\") " pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.620236 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.620214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjhh\" (UniqueName: \"kubernetes.io/projected/62d9f82c-64e8-47dd-9c00-4a979c247925-kube-api-access-wdjhh\") pod \"multus-57wkd\" (UID: \"62d9f82c-64e8-47dd-9c00-4a979c247925\") " pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.620472 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.620456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkglj\" (UniqueName: \"kubernetes.io/projected/cde4b0d1-afe8-471e-9274-67dea8902733-kube-api-access-pkglj\") pod \"ovnkube-node-9blws\" (UID: \"cde4b0d1-afe8-471e-9274-67dea8902733\") " pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.621249 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.621233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7g7k\" (UniqueName: \"kubernetes.io/projected/993dc3a3-4c4d-4d45-92a5-a952464091dc-kube-api-access-f7g7k\") pod \"node-ca-ltpdf\" (UID: \"993dc3a3-4c4d-4d45-92a5-a952464091dc\") " pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.623848 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.623829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmfl\" (UniqueName: \"kubernetes.io/projected/4d5279a2-c42c-42b0-a00f-df176466bd90-kube-api-access-vqmfl\") pod \"network-metrics-daemon-6h7k6\" (UID: 
\"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:56.628805 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.628776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgp9v\" (UniqueName: \"kubernetes.io/projected/50425312-0cb9-4942-aa9c-d32f6f8ba0f6-kube-api-access-wgp9v\") pod \"multus-additional-cni-plugins-m4nz6\" (UID: \"50425312-0cb9-4942-aa9c-d32f6f8ba0f6\") " pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.714646 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-sys\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714646 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-tmp\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-agent-certs\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-sys\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-var-lib-kubelet\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-sys-fs\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-run\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-var-lib-kubelet\") pod \"tuned-2gfr9\" (UID: 
\"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-sys-fs\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-tmp-dir\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-run\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.714898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-host\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbxr\" (UniqueName: \"kubernetes.io/projected/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-kube-api-access-lqbxr\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysconfig\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-tuned\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.714995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-host\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-kubernetes\") pod \"tuned-2gfr9\" (UID: 
\"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-konnectivity-ca\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-conf\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-tmp-dir\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-hosts-file\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysconfig\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjqm\" (UniqueName: \"kubernetes.io/projected/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-kube-api-access-ssjqm\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-socket-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-registration-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.715294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-modprobe-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-registration-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-kubernetes\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-conf\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-lib-modules\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 
kubenswrapper[2575]: I0424 22:29:56.715419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-device-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-etc-selinux\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-lib-modules\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-socket-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/160853ee-1e1f-44e4-8c70-bf9ac01117b2-device-dir\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-konnectivity-ca\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-sysctl-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qncbd\" (UniqueName: \"kubernetes.io/projected/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kube-api-access-qncbd\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: 
\"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.716069 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-hosts-file\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.716986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-systemd\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-modprobe-d\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.716986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.715818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-systemd\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.717583 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.717560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-etc-tuned\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.717708 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.717583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-tmp\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.717708 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.717678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/78fc0fe0-05b7-43dc-a67f-00b59f3eaca9-agent-certs\") pod \"konnectivity-agent-tw4b7\" (UID: \"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9\") " pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.725977 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.725901 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:56.725977 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.725925 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:56.725977 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.725936 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod 
openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:56.726210 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:56.726013 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:57.225991731 +0000 UTC m=+3.147634983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:56.726374 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.726345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjqm\" (UniqueName: \"kubernetes.io/projected/8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88-kube-api-access-ssjqm\") pod \"tuned-2gfr9\" (UID: \"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88\") " pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:56.727225 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.727198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qncbd\" (UniqueName: \"kubernetes.io/projected/160853ee-1e1f-44e4-8c70-bf9ac01117b2-kube-api-access-qncbd\") pod \"aws-ebs-csi-driver-node-j6nhx\" (UID: \"160853ee-1e1f-44e4-8c70-bf9ac01117b2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.728258 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.728238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbxr\" (UniqueName: \"kubernetes.io/projected/b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f-kube-api-access-lqbxr\") pod \"node-resolver-72jb2\" (UID: \"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f\") " pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.797854 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.797820 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:29:56.804571 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.804544 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ltpdf" Apr 24 22:29:56.814326 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.814293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-57wkd" Apr 24 22:29:56.819113 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.819077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" Apr 24 22:29:56.825787 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.825764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bfb6l" Apr 24 22:29:56.831899 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.831876 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-72jb2" Apr 24 22:29:56.838940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.838922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:29:56.846530 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.846507 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" Apr 24 22:29:56.851568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.851551 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:56.852701 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:56.852683 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" Apr 24 22:29:57.118097 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.118025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:57.118247 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.118199 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:57.118311 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.118279 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.118259578 +0000 UTC m=+4.039902834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:57.177782 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.177575 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a4f470_707a_47cd_a98e_5cc998b168bc.slice/crio-a4136a4a883bbfedcfd4f30894cd297b03c118224e6b72189a16e38312f474ce WatchSource:0}: Error finding container a4136a4a883bbfedcfd4f30894cd297b03c118224e6b72189a16e38312f474ce: Status 404 returned error can't find the container with id a4136a4a883bbfedcfd4f30894cd297b03c118224e6b72189a16e38312f474ce Apr 24 22:29:57.178771 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.178710 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78fc0fe0_05b7_43dc_a67f_00b59f3eaca9.slice/crio-2f175dc21954ef40a1d677f1fea9c9305c2058566ac0fbf67b8435e3178f485e WatchSource:0}: Error finding container 2f175dc21954ef40a1d677f1fea9c9305c2058566ac0fbf67b8435e3178f485e: Status 404 returned error can't find the container with id 2f175dc21954ef40a1d677f1fea9c9305c2058566ac0fbf67b8435e3178f485e Apr 24 22:29:57.179236 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.179211 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160853ee_1e1f_44e4_8c70_bf9ac01117b2.slice/crio-8dc52446a8f93fc7162322a3f4cb66f0befd6091e76786cda426b0ba25edcfdb WatchSource:0}: Error finding container 8dc52446a8f93fc7162322a3f4cb66f0befd6091e76786cda426b0ba25edcfdb: Status 404 returned error can't find the container with id 8dc52446a8f93fc7162322a3f4cb66f0befd6091e76786cda426b0ba25edcfdb Apr 24 22:29:57.180626 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.180223 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod993dc3a3_4c4d_4d45_92a5_a952464091dc.slice/crio-327d1678cc6bd959c4f17725ab61f07880ec87846c77febb4aa1d175b5d5b908 WatchSource:0}: Error finding container 327d1678cc6bd959c4f17725ab61f07880ec87846c77febb4aa1d175b5d5b908: Status 404 returned error can't find the container with id 327d1678cc6bd959c4f17725ab61f07880ec87846c77febb4aa1d175b5d5b908 Apr 24 22:29:57.183084 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.183026 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50425312_0cb9_4942_aa9c_d32f6f8ba0f6.slice/crio-c1989a8fff54f783cb3a1196f126299148c688af3c2934f9c344e6b137b36494 WatchSource:0}: Error finding container c1989a8fff54f783cb3a1196f126299148c688af3c2934f9c344e6b137b36494: Status 404 returned error can't find the container with id c1989a8fff54f783cb3a1196f126299148c688af3c2934f9c344e6b137b36494 Apr 24 22:29:57.184190 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.184165 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d9f82c_64e8_47dd_9c00_4a979c247925.slice/crio-860d37fda27d95495a9e59a5108e680b16794736bf30aedbe051eeb0853fda02 WatchSource:0}: Error finding container 860d37fda27d95495a9e59a5108e680b16794736bf30aedbe051eeb0853fda02: Status 404 returned error can't 
find the container with id 860d37fda27d95495a9e59a5108e680b16794736bf30aedbe051eeb0853fda02 Apr 24 22:29:57.185061 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.185020 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde4b0d1_afe8_471e_9274_67dea8902733.slice/crio-f3833e74a162c3126381426a43a0f27a303fe22ad60c81305e61e0196e8a6ce0 WatchSource:0}: Error finding container f3833e74a162c3126381426a43a0f27a303fe22ad60c81305e61e0196e8a6ce0: Status 404 returned error can't find the container with id f3833e74a162c3126381426a43a0f27a303fe22ad60c81305e61e0196e8a6ce0 Apr 24 22:29:57.188281 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.185843 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d48e8cb_0a88_4aa5_aa23_7d4f745d1c88.slice/crio-9bb8118f55eed8448e09ffeadaeb436ae3c0fb4915cf818839fd33cb7152d994 WatchSource:0}: Error finding container 9bb8118f55eed8448e09ffeadaeb436ae3c0fb4915cf818839fd33cb7152d994: Status 404 returned error can't find the container with id 9bb8118f55eed8448e09ffeadaeb436ae3c0fb4915cf818839fd33cb7152d994 Apr 24 22:29:57.188281 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:29:57.187291 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb489f9e3_3cc2_43d8_9554_bfa2f6c7aa6f.slice/crio-2bdafbadbafa2b12a876716845b544a283a3f59d19ae0d58dd989caff50fbe37 WatchSource:0}: Error finding container 2bdafbadbafa2b12a876716845b544a283a3f59d19ae0d58dd989caff50fbe37: Status 404 returned error can't find the container with id 2bdafbadbafa2b12a876716845b544a283a3f59d19ae0d58dd989caff50fbe37 Apr 24 22:29:57.319669 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.319636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:29:57.319815 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.319795 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:57.319875 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.319820 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:57.319875 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.319831 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:57.319938 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:57.319882 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.319869203 +0000 UTC m=+4.241512446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:57.546721 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.546670 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:55 +0000 UTC" deadline="2028-01-26 19:24:08.276690206 +0000 UTC" Apr 24 22:29:57.546721 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.546713 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15404h54m10.729980719s" Apr 24 22:29:57.578128 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.578091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72jb2" event={"ID":"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f","Type":"ContainerStarted","Data":"2bdafbadbafa2b12a876716845b544a283a3f59d19ae0d58dd989caff50fbe37"} Apr 24 22:29:57.583306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.583264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57wkd" event={"ID":"62d9f82c-64e8-47dd-9c00-4a979c247925","Type":"ContainerStarted","Data":"860d37fda27d95495a9e59a5108e680b16794736bf30aedbe051eeb0853fda02"} Apr 24 22:29:57.586713 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.586670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerStarted","Data":"c1989a8fff54f783cb3a1196f126299148c688af3c2934f9c344e6b137b36494"} Apr 24 22:29:57.591187 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.590602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ltpdf" event={"ID":"993dc3a3-4c4d-4d45-92a5-a952464091dc","Type":"ContainerStarted","Data":"327d1678cc6bd959c4f17725ab61f07880ec87846c77febb4aa1d175b5d5b908"} Apr 24 22:29:57.594116 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.594088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tw4b7" event={"ID":"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9","Type":"ContainerStarted","Data":"2f175dc21954ef40a1d677f1fea9c9305c2058566ac0fbf67b8435e3178f485e"} Apr 24 22:29:57.596083 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.596060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bfb6l" event={"ID":"b4a4f470-707a-47cd-a98e-5cc998b168bc","Type":"ContainerStarted","Data":"a4136a4a883bbfedcfd4f30894cd297b03c118224e6b72189a16e38312f474ce"} Apr 24 22:29:57.598104 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.598081 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal" event={"ID":"55b514acdb5c12be2a393c574a525294","Type":"ContainerStarted","Data":"6f8edf9ee083a6cd885617b23e68ed6b519a0079ba44b367c62fa33565be291c"} Apr 24 22:29:57.602884 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.602859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" 
event={"ID":"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88","Type":"ContainerStarted","Data":"9bb8118f55eed8448e09ffeadaeb436ae3c0fb4915cf818839fd33cb7152d994"} Apr 24 22:29:57.610932 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.610675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"f3833e74a162c3126381426a43a0f27a303fe22ad60c81305e61e0196e8a6ce0"} Apr 24 22:29:57.617493 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:57.617454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" event={"ID":"160853ee-1e1f-44e4-8c70-bf9ac01117b2","Type":"ContainerStarted","Data":"8dc52446a8f93fc7162322a3f4cb66f0befd6091e76786cda426b0ba25edcfdb"} Apr 24 22:29:58.127683 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.127643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:58.127917 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.127855 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:58.127978 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.127920 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.127901372 +0000 UTC m=+6.049544614 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:58.329900 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.329811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:29:58.330066 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.330004 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:58.330066 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.330024 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:58.330066 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.330036 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:58.330225 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.330094 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.330076625 +0000 UTC m=+6.251719886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:58.562902 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.561830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:29:58.562902 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.561958 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:29:58.562902 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.562743 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:29:58.562902 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:29:58.562860 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:29:58.656125 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.656037 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ee252bcd9db0f1642e38552116315a1" containerID="8c92a748f358d6b84f6db9d0d82386a706ed0c651affa63f6214efd39f1c5572" exitCode=0 Apr 24 22:29:58.657086 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.657006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal" event={"ID":"2ee252bcd9db0f1642e38552116315a1","Type":"ContainerDied","Data":"8c92a748f358d6b84f6db9d0d82386a706ed0c651affa63f6214efd39f1c5572"} Apr 24 22:29:58.675670 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:58.675612 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-138.ec2.internal" podStartSLOduration=3.675576352 podStartE2EDuration="3.675576352s" podCreationTimestamp="2026-04-24 22:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:57.617070453 +0000 UTC m=+3.538713715" watchObservedRunningTime="2026-04-24 22:29:58.675576352 +0000 UTC m=+4.597219615" Apr 24 22:29:59.668879 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:29:59.668841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal" event={"ID":"2ee252bcd9db0f1642e38552116315a1","Type":"ContainerStarted","Data":"247823d57839d340bc8fa6c0652f617758cefe9cf928484b16aedaa6440b9582"} Apr 24 22:30:00.146853 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:00.146786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:00.147062 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.146984 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:00.147062 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.147051 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:04.147031452 +0000 UTC m=+10.068674696 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:00.348492 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:00.348450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:00.348693 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.348671 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:00.348693 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.348691 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:00.348808 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.348704 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:00.348808 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.348764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:04.348745495 +0000 UTC m=+10.270388741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:00.562121 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:00.562085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:00.562121 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:00.562118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:00.562350 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.562234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:00.562412 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:00.562343 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:02.561631 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:02.561584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:02.562092 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:02.561738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:02.562092 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:02.561932 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:02.562092 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:02.562083 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:04.179363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:04.179258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:04.179868 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.179397 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:04.179868 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.179480 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.179458155 +0000 UTC m=+18.101101416 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:04.381292 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:04.381008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:04.381292 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.381197 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:04.381292 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.381216 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:04.381292 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.381229 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:04.381292 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.381295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.3812768 +0000 UTC m=+18.302920042 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:04.562454 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:04.562419 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:04.562454 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:04.562446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:04.562717 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.562559 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:04.563653 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:04.563622 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:06.561572 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:06.561533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:06.562032 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:06.561668 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:06.562032 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:06.561708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:06.562032 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:06.561767 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:08.561079 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:08.561042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:08.561521 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:08.561046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:08.561521 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:08.561167 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:08.561521 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:08.561268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:10.561874 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:10.561835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:10.562336 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:10.561894 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:10.562336 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:10.562009 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:10.562336 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:10.562133 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:12.237088 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:12.237036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:12.237547 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.237209 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:12.237547 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.237289 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:28.237268332 +0000 UTC m=+34.158911580 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:12.438620 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:12.438567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:12.438791 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.438733 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:12.438791 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.438759 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:12.438791 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.438772 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:12.438932 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.438836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:28.438813375 +0000 UTC m=+34.360456616 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:12.561489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:12.561413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:12.561489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:12.561457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:12.561701 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.561557 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:12.561748 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:12.561703 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:14.562292 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.562117 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:14.562855 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.562181 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:14.562855 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:14.562372 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:14.562855 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:14.562435 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:14.695959 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.695922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" event={"ID":"8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88","Type":"ContainerStarted","Data":"38527cbb39532b2c00c75071a33cec290a4c26276e4b4074da8f45ea9411604f"} Apr 24 22:30:14.699396 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.698953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"d16bfd9cb5c05850b3248cb5a7b73faae4e1728481f5e745056e99ce8a9f81e7"} Apr 24 22:30:14.699396 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.698990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"06acc23dd159fd5b6393d23cb101e9e39eb55a7aebf458ec9e6dcc7db2a8f8dd"} Apr 24 22:30:14.699396 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.699003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"f6076bcd3678409e8ba41febb77cfe2de541d018a63529220810a41841e592d0"} Apr 24 22:30:14.701889 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.701853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" event={"ID":"160853ee-1e1f-44e4-8c70-bf9ac01117b2","Type":"ContainerStarted","Data":"bf2945471db80e9ba2fa8ac7c5da66e550a6d3914d6294a5c5a920abfe8a8ba7"} Apr 24 22:30:14.703970 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.703944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72jb2" event={"ID":"b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f","Type":"ContainerStarted","Data":"2bdd74ec40c3781ad440e8f8b43f9c349f4f3b44f3aa5add2cdf0d1f83a57688"} Apr 24 22:30:14.705397 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.705371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57wkd" event={"ID":"62d9f82c-64e8-47dd-9c00-4a979c247925","Type":"ContainerStarted","Data":"19f543d333931158405e1f045acaaab5c6283c21a9444da8a577c3685e980dd6"} Apr 24 22:30:14.706968 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.706935 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" containerID="3549393e551a7c050823e72eaf384c162c6cb23787358ba652889276dded6129" exitCode=0 Apr 24 22:30:14.707072 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.706972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"3549393e551a7c050823e72eaf384c162c6cb23787358ba652889276dded6129"} Apr 24 22:30:14.709333 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.709282 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ltpdf" event={"ID":"993dc3a3-4c4d-4d45-92a5-a952464091dc","Type":"ContainerStarted","Data":"e3993fcfd7d2828e786a2281cfa57d0b42252e18fc92b284fee33c7b7f230bf1"} Apr 24 22:30:14.710622 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.710525 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kube-system/konnectivity-agent-tw4b7" event={"ID":"78fc0fe0-05b7-43dc-a67f-00b59f3eaca9","Type":"ContainerStarted","Data":"4eb2341b6bff31b971559698569caa50d9c8b674dd9efecd929a66f1339a3615"} Apr 24 22:30:14.727288 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.727231 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-138.ec2.internal" podStartSLOduration=19.727212377 podStartE2EDuration="19.727212377s" podCreationTimestamp="2026-04-24 22:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:59.703255767 +0000 UTC m=+5.624899022" watchObservedRunningTime="2026-04-24 22:30:14.727212377 +0000 UTC m=+20.648855641" Apr 24 22:30:14.748551 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.748499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2gfr9" podStartSLOduration=4.109726873 podStartE2EDuration="20.748467306s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.188033388 +0000 UTC m=+3.109676627" lastFinishedPulling="2026-04-24 22:30:13.82677382 +0000 UTC m=+19.748417060" observedRunningTime="2026-04-24 22:30:14.727350953 +0000 UTC m=+20.648994226" watchObservedRunningTime="2026-04-24 22:30:14.748467306 +0000 UTC m=+20.670110568" Apr 24 22:30:14.789033 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.788990 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ltpdf" podStartSLOduration=4.146683555 podStartE2EDuration="20.788970401s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.1826379 +0000 UTC m=+3.104281141" lastFinishedPulling="2026-04-24 22:30:13.824924747 +0000 UTC m=+19.746567987" observedRunningTime="2026-04-24 22:30:14.748117415 +0000 UTC m=+20.669760675" watchObservedRunningTime="2026-04-24 22:30:14.788970401 +0000 UTC m=+20.710613662" Apr 24 22:30:14.815026 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.814979 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-72jb2" podStartSLOduration=4.1791985050000005 podStartE2EDuration="20.81496396s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.189018295 +0000 UTC m=+3.110661542" lastFinishedPulling="2026-04-24 22:30:13.824783743 +0000 UTC m=+19.746426997" observedRunningTime="2026-04-24 22:30:14.814739796 +0000 UTC m=+20.736383056" watchObservedRunningTime="2026-04-24 22:30:14.81496396 +0000 UTC m=+20.736607220" Apr 24 22:30:14.867280 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.867228 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-57wkd" podStartSLOduration=4.194749133 podStartE2EDuration="20.867210348s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.186330767 +0000 UTC m=+3.107974019" lastFinishedPulling="2026-04-24 22:30:13.858791995 +0000 UTC m=+19.780435234" observedRunningTime="2026-04-24 22:30:14.867103363 +0000 UTC m=+20.788746634" watchObservedRunningTime="2026-04-24 22:30:14.867210348 +0000 UTC m=+20.788853610" Apr 24 22:30:14.901489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:14.901428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-tw4b7" podStartSLOduration=4.257890823 podStartE2EDuration="20.901411639s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.181221901 +0000 UTC m=+3.102865144" lastFinishedPulling="2026-04-24 22:30:13.824742715 +0000 UTC m=+19.746385960" observedRunningTime="2026-04-24 22:30:14.900809817 +0000 UTC m=+20.822453078" watchObservedRunningTime="2026-04-24 22:30:14.901411639 +0000 UTC m=+20.823054931" Apr 24 22:30:15.434413 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.434187 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:30:15.578476 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.578358 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:15.434395131Z","UUID":"4557ac8b-5d2e-44af-a9af-3275ce2f0bf6","Handler":null,"Name":"","Endpoint":""} Apr 24 22:30:15.580863 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.580824 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:30:15.580863 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.580860 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:30:15.610205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.610169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:30:15.610943 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.610922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:30:15.714149 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.714108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bfb6l" event={"ID":"b4a4f470-707a-47cd-a98e-5cc998b168bc","Type":"ContainerStarted","Data":"1bc45369058b4c952069f55ae39504d2d1818c9d74ce473e912adcceb3d8cf26"} Apr 24 22:30:15.717217 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.717187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"a7d6460d08a7c4682b94d1056d3a8d630734b6779c74ef3f4d728a5f8e4f13f4"} Apr 24 22:30:15.717217 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.717216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"82f0ed2720721e6a8ca0ae4c6a8b0791b9419812bae3764044b9d9f2835bbf64"} Apr 24 22:30:15.717391 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.717231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"91deb3fb533d354aac730dcba88ffab3c6025d0cf1bddb48140d0122be314191"} Apr 24 22:30:15.718951 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.718910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" 
event={"ID":"160853ee-1e1f-44e4-8c70-bf9ac01117b2","Type":"ContainerStarted","Data":"42b0799fe1766a4333f4983ab577ab9ee42f2802ae36f303bc79f42264eb9711"} Apr 24 22:30:15.719644 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.719618 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:30:15.720053 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.720030 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tw4b7" Apr 24 22:30:15.740044 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:15.739989 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bfb6l" podStartSLOduration=5.095332087 podStartE2EDuration="21.739970489s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.180283244 +0000 UTC m=+3.101926499" lastFinishedPulling="2026-04-24 22:30:13.824921652 +0000 UTC m=+19.746564901" observedRunningTime="2026-04-24 22:30:15.739805505 +0000 UTC m=+21.661448765" watchObservedRunningTime="2026-04-24 22:30:15.739970489 +0000 UTC m=+21.661613753" Apr 24 22:30:16.561154 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:16.561118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:16.561389 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:16.561246 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:16.561389 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:16.561281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:16.561466 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:16.561408 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:17.724972 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:17.724928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" event={"ID":"160853ee-1e1f-44e4-8c70-bf9ac01117b2","Type":"ContainerStarted","Data":"4fb7bd2b153d5d19d971476bdb7489eea861f445b9769c07e0af7f8813c04344"} Apr 24 22:30:17.743096 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:17.743047 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j6nhx" podStartSLOduration=4.31165809 podStartE2EDuration="23.743032719s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.181719019 +0000 UTC m=+3.103362262" lastFinishedPulling="2026-04-24 22:30:16.613093646 +0000 UTC m=+22.534736891" observedRunningTime="2026-04-24 22:30:17.74302969 +0000 UTC m=+23.664672952" watchObservedRunningTime="2026-04-24 22:30:17.743032719 +0000 UTC m=+23.664675981" Apr 24 22:30:18.561511 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:18.561478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:18.561718 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:18.561622 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:18.561718 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:18.561667 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:18.561840 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:18.561814 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:19.731840 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:19.731805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"2aa98723cb0858b1e827eadc2246001bc3d1acfce5499f91e1df6fde2ac48f02"} Apr 24 22:30:19.733289 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:19.733262 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" containerID="df5261beffbfa8e682246905af90f706c894f0b8645ba38925c2936f4831ffa8" exitCode=0 Apr 24 22:30:19.733409 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:19.733302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"df5261beffbfa8e682246905af90f706c894f0b8645ba38925c2936f4831ffa8"} Apr 24 22:30:20.561321 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:20.561288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:20.561321 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:20.561316 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:20.561544 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:20.561420 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:20.561544 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:20.561481 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:20.737472 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:20.737438 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" containerID="f1a541faf7275a274ae8bb3fec25f5795272f8159089d4d098f992f7390605f9" exitCode=0 Apr 24 22:30:20.737951 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:20.737484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"f1a541faf7275a274ae8bb3fec25f5795272f8159089d4d098f992f7390605f9"} Apr 24 22:30:21.741059 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.740810 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" containerID="08b5c12ed5b4a676cf9088717257a2200ed57914e5672fe4c8645a8a920796d3" exitCode=0 Apr 24 22:30:21.741502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.740887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"08b5c12ed5b4a676cf9088717257a2200ed57914e5672fe4c8645a8a920796d3"} Apr 24 22:30:21.744292 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.744270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" event={"ID":"cde4b0d1-afe8-471e-9274-67dea8902733","Type":"ContainerStarted","Data":"aef4f64636bfd3ec428f7ff005091c071e2050defcc90a1c43a548b2551d0049"} Apr 24 22:30:21.744565 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.744547 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:30:21.744654 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.744579 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:30:21.744654 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.744613 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:30:21.759264 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.759235 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:30:21.759398 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.759334 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" Apr 24 22:30:21.802011 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:21.801957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9blws" podStartSLOduration=10.780125778 podStartE2EDuration="27.801942373s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.187643693 +0000 UTC m=+3.109286946" lastFinishedPulling="2026-04-24 22:30:14.209460287 +0000 UTC m=+20.131103541" observedRunningTime="2026-04-24 22:30:21.801856545 +0000 UTC m=+27.723499808" watchObservedRunningTime="2026-04-24 22:30:21.801942373 +0000 UTC m=+27.723585641" Apr 24 22:30:22.562724 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.562170 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:22.562724 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:22.562300 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:22.562724 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.562382 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:22.562724 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:22.562506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:22.889049 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.888957 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6h7k6"] Apr 24 22:30:22.889457 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.889107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:22.889457 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:22.889232 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:22.891332 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.891277 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kb7rl"] Apr 24 22:30:22.891478 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:22.891394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:22.891546 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:22.891490 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:24.565659 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:24.565453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:24.566072 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:24.565459 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:24.566072 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:24.565732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:24.566072 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:24.565854 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:26.561847 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:26.561809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:26.562411 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:26.561854 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:26.562411 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:26.561951 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:30:26.562411 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:26.562042 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kb7rl" podUID="b85211f3-5059-45ad-85fd-0c5901095d1e" Apr 24 22:30:26.943941 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:26.943914 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-138.ec2.internal" event="NodeReady" Apr 24 22:30:26.944170 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:26.944078 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:27.013319 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.013284 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rmzcn"] Apr 24 22:30:27.017703 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.017677 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.022976 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.022947 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:27.023123 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.022983 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:27.023123 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.022993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:27.023123 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.023027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\"" Apr 24 22:30:27.029240 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.029220 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9ljzs"] Apr 24 22:30:27.032281 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.032262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.038437 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.038228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\"" Apr 24 22:30:27.038437 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.038249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:27.038437 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.038396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:27.039228 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.039206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmzcn"] Apr 24 22:30:27.050855 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.050831 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9ljzs"] Apr 24 22:30:27.153861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.153826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.153861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.153862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.154082 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.153953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de7eead8-356f-4ed5-a05e-ca346be1cd7c-tmp-dir\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.154082 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:30:27.154024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7eead8-356f-4ed5-a05e-ca346be1cd7c-config-volume\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.154082 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.154058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8298z\" (UniqueName: \"kubernetes.io/projected/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-kube-api-access-8298z\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.154208 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.154084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plpnd\" (UniqueName: \"kubernetes.io/projected/de7eead8-356f-4ed5-a05e-ca346be1cd7c-kube-api-access-plpnd\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.255514 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8298z\" (UniqueName: \"kubernetes.io/projected/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-kube-api-access-8298z\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.255514 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plpnd\" (UniqueName: \"kubernetes.io/projected/de7eead8-356f-4ed5-a05e-ca346be1cd7c-kube-api-access-plpnd\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.255642 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.255680 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.255698 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:27.755679386 +0000 UTC m=+33.677322624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found Apr 24 22:30:27.255770 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.255740 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.755722387 +0000 UTC m=+33.677365634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found Apr 24 22:30:27.256050 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de7eead8-356f-4ed5-a05e-ca346be1cd7c-tmp-dir\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.256050 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.255824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7eead8-356f-4ed5-a05e-ca346be1cd7c-config-volume\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.256260 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.256226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de7eead8-356f-4ed5-a05e-ca346be1cd7c-tmp-dir\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.256391 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.256372 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7eead8-356f-4ed5-a05e-ca346be1cd7c-config-volume\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.268271 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.268243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plpnd\" (UniqueName: \"kubernetes.io/projected/de7eead8-356f-4ed5-a05e-ca346be1cd7c-kube-api-access-plpnd\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.268403 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.268244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8298z\" (UniqueName: \"kubernetes.io/projected/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-kube-api-access-8298z\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.758074 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.758039 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" 
containerID="e7cba778de4a9eb931f309ce082d31a04a8f0c409fd6687258c3a501e471f311" exitCode=0 Apr 24 22:30:27.758693 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.758084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"e7cba778de4a9eb931f309ce082d31a04a8f0c409fd6687258c3a501e471f311"} Apr 24 22:30:27.759894 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.759873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:27.759988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:27.759899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:27.760057 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.759999 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:27.760057 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.760043 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:28.760030154 +0000 UTC m=+34.681673393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found Apr 24 22:30:27.760166 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.760000 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:27.760166 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:27.760130 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:28.76011638 +0000 UTC m=+34.681759620 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found Apr 24 22:30:28.264779 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.264748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:28.264977 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.264926 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:28.265040 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.265004 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:00.264984814 +0000 UTC m=+66.186628064 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:28.465715 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.465670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:28.465884 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.465825 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:28.465884 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.465847 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:28.465884 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.465858 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j4ldb for pod openshift-network-diagnostics/network-check-target-kb7rl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:28.466004 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.465910 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb podName:b85211f3-5059-45ad-85fd-0c5901095d1e nodeName:}" failed. No retries permitted until 2026-04-24 22:31:00.465896128 +0000 UTC m=+66.387539373 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j4ldb" (UniqueName: "kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb") pod "network-check-target-kb7rl" (UID: "b85211f3-5059-45ad-85fd-0c5901095d1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:28.564784 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.564708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:30:28.564935 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.564709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl" Apr 24 22:30:28.569751 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.569722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:28.570780 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.570758 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:28.570780 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.570772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\"" Apr 24 22:30:28.570960 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.570795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:28.570960 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.570860 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgksw\"" Apr 24 22:30:28.762666 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.762634 2575 generic.go:358] "Generic (PLEG): container finished" podID="50425312-0cb9-4942-aa9c-d32f6f8ba0f6" containerID="a9ae4b03dffb7de729e961577c63dae22c4a4ccffe13b04269a9cb748953d8d2" exitCode=0 Apr 24 22:30:28.763096 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.762696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerDied","Data":"a9ae4b03dffb7de729e961577c63dae22c4a4ccffe13b04269a9cb748953d8d2"} Apr 24 22:30:28.768280 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.768258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:30:28.768376 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:28.768289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:30:28.768433 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.768381 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:28.768433 
ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.768382 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:28.768511 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.768437 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.768421234 +0000 UTC m=+36.690064474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:30:28.768511 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:28.768455 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.768445996 +0000 UTC m=+36.690089236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:29.767062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:29.767028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" event={"ID":"50425312-0cb9-4942-aa9c-d32f6f8ba0f6","Type":"ContainerStarted","Data":"5620dbe73ae6b9521faeede528f5721ac02a3e63dac42c5dff7c22b142b442de"}
Apr 24 22:30:29.799218 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:29.799170 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m4nz6" podStartSLOduration=5.640238651 podStartE2EDuration="35.799156174s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:29:57.184974503 +0000 UTC m=+3.106617746" lastFinishedPulling="2026-04-24 22:30:27.343892031 +0000 UTC m=+33.265535269" observedRunningTime="2026-04-24 22:30:29.798549334 +0000 UTC m=+35.720192596" watchObservedRunningTime="2026-04-24 22:30:29.799156174 +0000 UTC m=+35.720799435"
Apr 24 22:30:30.782606 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:30.782418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs"
Apr 24 22:30:30.782982 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:30.782632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn"
Apr 24 22:30:30.782982 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:30.782558 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:30.782982 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:30.782718 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:30.782982 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:30.782739 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:34.782723014 +0000 UTC m=+40.704366252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:30.782982 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:30.782760 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:34.782746958 +0000 UTC m=+40.704390198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:30:34.811352 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:34.811313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs"
Apr 24 22:30:34.811352 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:34.811353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn"
Apr 24 22:30:34.811810 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:34.811475 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:34.811810 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:34.811541 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.811527694 +0000 UTC m=+48.733170933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:30:34.811810 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:34.811475 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:34.811810 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:34.811646 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.811629227 +0000 UTC m=+48.733272479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:42.867580 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:42.867545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs"
Apr 24 22:30:42.868007 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:42.867585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn"
Apr 24 22:30:42.868007 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:42.867704 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:42.868007 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:42.867762 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:58.867747759 +0000 UTC m=+64.789391003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:42.868007 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:42.867778 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:42.868007 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:42.867832 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:58.867816545 +0000 UTC m=+64.789459784 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:30:53.757367 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:53.757339 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9blws"
Apr 24 22:30:58.877717 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:58.877676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs"
Apr 24 22:30:58.877717 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:30:58.877716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn"
Apr 24 22:30:58.878209 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:58.877850 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:58.878209 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:58.877911 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:31:30.877897688 +0000 UTC m=+96.799540928 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:30:58.878209 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:58.877850 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:58.878209 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:30:58.877981 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:30.877974112 +0000 UTC m=+96.799617351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:00.286885 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.286847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6"
Apr 24 22:31:00.289507 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.289484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 22:31:00.297729 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:00.297707 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:31:00.297834 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:00.297797 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:04.297774618 +0000 UTC m=+130.219417868 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : secret "metrics-daemon-secret" not found
Apr 24 22:31:00.487842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.487802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:31:00.490939 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.490915 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:31:00.500721 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.500699 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:31:00.511864 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.511841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ldb\" (UniqueName: \"kubernetes.io/projected/b85211f3-5059-45ad-85fd-0c5901095d1e-kube-api-access-j4ldb\") pod \"network-check-target-kb7rl\" (UID: \"b85211f3-5059-45ad-85fd-0c5901095d1e\") " pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:31:00.681850 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.681774 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgksw\""
Apr 24 22:31:00.689136 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.689107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:31:00.843132 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:00.843101 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kb7rl"]
Apr 24 22:31:00.846861 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:31:00.846833 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85211f3_5059_45ad_85fd_0c5901095d1e.slice/crio-6fccb301162bd53a44bc8ff7dffe42d313a7bfa321b5bb232613b5686ef11fbc WatchSource:0}: Error finding container 6fccb301162bd53a44bc8ff7dffe42d313a7bfa321b5bb232613b5686ef11fbc: Status 404 returned error can't find the container with id 6fccb301162bd53a44bc8ff7dffe42d313a7bfa321b5bb232613b5686ef11fbc
Apr 24 22:31:01.827389 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:01.827348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kb7rl" event={"ID":"b85211f3-5059-45ad-85fd-0c5901095d1e","Type":"ContainerStarted","Data":"6fccb301162bd53a44bc8ff7dffe42d313a7bfa321b5bb232613b5686ef11fbc"}
Apr 24 22:31:03.832313 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:03.832219 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kb7rl" event={"ID":"b85211f3-5059-45ad-85fd-0c5901095d1e","Type":"ContainerStarted","Data":"089304c9f9ef631d20e4d59d59a3a49b19512c681ee70205fd05258f91c56123"}
Apr 24 22:31:03.832708 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:03.832363 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:31:03.849342 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:03.849295 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kb7rl" podStartSLOduration=67.207746119 podStartE2EDuration="1m9.849280973s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:31:00.848660183 +0000 UTC m=+66.770303426" lastFinishedPulling="2026-04-24 22:31:03.490195027 +0000 UTC m=+69.411838280" observedRunningTime="2026-04-24 22:31:03.848946254 +0000 UTC m=+69.770589559" watchObservedRunningTime="2026-04-24 22:31:03.849280973 +0000 UTC m=+69.770924233"
Apr 24 22:31:30.885912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:30.885781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs"
Apr 24 22:31:30.885912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:30.885818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn"
Apr 24 22:31:30.886436 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:30.885941 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:30.886436 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:30.885956 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:30.886436 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:30.886006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert podName:234a1a93-e9ac-4d93-9c1a-57d82a34f0fb nodeName:}" failed. No retries permitted until 2026-04-24 22:32:34.885988015 +0000 UTC m=+160.807631257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert") pod "ingress-canary-rmzcn" (UID: "234a1a93-e9ac-4d93-9c1a-57d82a34f0fb") : secret "canary-serving-cert" not found
Apr 24 22:31:30.886436 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:31:30.886054 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls podName:de7eead8-356f-4ed5-a05e-ca346be1cd7c nodeName:}" failed. No retries permitted until 2026-04-24 22:32:34.886031209 +0000 UTC m=+160.807674458 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls") pod "dns-default-9ljzs" (UID: "de7eead8-356f-4ed5-a05e-ca346be1cd7c") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:34.836118 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:31:34.836090 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kb7rl"
Apr 24 22:32:04.313167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:04.313107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6"
Apr 24 22:32:04.313690 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:04.313235 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:32:04.313690 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:04.313297 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs podName:4d5279a2-c42c-42b0-a00f-df176466bd90 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:06.313282485 +0000 UTC m=+252.234925725 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs") pod "network-metrics-daemon-6h7k6" (UID: "4d5279a2-c42c-42b0-a00f-df176466bd90") : secret "metrics-daemon-secret" not found
Apr 24 22:32:14.785547 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.785511 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"]
Apr 24 22:32:14.794253 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.794226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.797940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.797917 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:32:14.799082 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.799059 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2z2cf\""
Apr 24 22:32:14.799191 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.799097 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 22:32:14.799191 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.799061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 22:32:14.799299 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.799188 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:32:14.805323 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.805303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"]
Apr 24 22:32:14.884326 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.884297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/544ed2db-d8bb-44fd-824b-848a3cc34ab7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.884326 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.884330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8kr\" (UniqueName: \"kubernetes.io/projected/544ed2db-d8bb-44fd-824b-848a3cc34ab7-kube-api-access-sp8kr\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.884526 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.884376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.885914 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.885885 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"]
Apr 24 22:32:14.889161 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.889135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"
Apr 24 22:32:14.892174 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.892151 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:32:14.892297 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.892222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 22:32:14.892362 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.892332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-jvz6d\""
Apr 24 22:32:14.892502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.892482 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-86466888b4-mw72x"]
Apr 24 22:32:14.895266 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.895248 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.897701 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.897678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-t9m85\""
Apr 24 22:32:14.897804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.897678 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 22:32:14.897866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.897803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"]
Apr 24 22:32:14.897866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.897830 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 22:32:14.897989 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.897973 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 22:32:14.898195 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.898180 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 22:32:14.898305 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.898218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 22:32:14.898305 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.898240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 22:32:14.909265 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.909240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86466888b4-mw72x"]
Apr 24 22:32:14.982952 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.982916 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv"]
Apr 24 22:32:14.985354 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-stats-auth\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.985502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/544ed2db-d8bb-44fd-824b-848a3cc34ab7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.985502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8kr\" (UniqueName: \"kubernetes.io/projected/544ed2db-d8bb-44fd-824b-848a3cc34ab7-kube-api-access-sp8kr\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.985502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.985502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-default-certificate\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.985502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.985756 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:14.985756 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5wl\" (UniqueName: \"kubernetes.io/projected/4f41768b-12df-4ac4-ab57-68ec0bada16d-kube-api-access-hr5wl\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:14.985756 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qm8\" (UniqueName: \"kubernetes.io/projected/6ab9f5f8-559e-4176-b778-01a1a51317a4-kube-api-access-27qm8\") pod \"volume-data-source-validator-7c6cbb6c87-vx8c6\" (UID: \"6ab9f5f8-559e-4176-b778-01a1a51317a4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qm8\" (UniqueName: \"kubernetes.io/projected/6ab9f5f8-559e-4176-b778-01a1a51317a4-kube-api-access-27qm8\") pod \"volume-data-source-validator-7c6cbb6c87-vx8c6\" (UID: \"6ab9f5f8-559e-4176-b778-01a1a51317a4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" Apr 24 22:32:14.985756 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:14.985635 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:14.985897 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:14.985786 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:15.485767655 +0000 UTC m=+141.407410895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:14.985897 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.985822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" Apr 24 22:32:14.986169 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.986143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/544ed2db-d8bb-44fd-824b-848a3cc34ab7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:14.989986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.989957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-54mtb\"" Apr 24 22:32:14.992092 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.992051 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx"] Apr 24 22:32:14.994881 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.994841 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t"] Apr 24 22:32:14.994989 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.994978 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:14.997654 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.997634 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:14.999338 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:14.999320 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv"] Apr 24 22:32:15.005549 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.005529 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 22:32:15.005705 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.005682 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 22:32:15.005813 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.005618 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 22:32:15.006182 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.006163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qd8rl\"" Apr 24 22:32:15.006365 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.006253 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:32:15.007550 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.007530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 22:32:15.007660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.007640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 22:32:15.009478 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.009463 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:32:15.011739 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.011722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 22:32:15.012314 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.012295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-s7wjv\"" Apr 24 22:32:15.013225 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.013209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx"] Apr 24 22:32:15.019713 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.019688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t"] Apr 24 22:32:15.030803 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.030782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8kr\" (UniqueName: \"kubernetes.io/projected/544ed2db-d8bb-44fd-824b-848a3cc34ab7-kube-api-access-sp8kr\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:15.086268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0783a162-f638-447b-b28a-38a88c620edb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.086268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t95qm\" (UniqueName: \"kubernetes.io/projected/352df9b5-3d38-4772-8e28-cff124503696-kube-api-access-t95qm\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.086268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5wl\" (UniqueName: \"kubernetes.io/projected/4f41768b-12df-4ac4-ab57-68ec0bada16d-kube-api-access-hr5wl\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.086564 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27n9k\" (UniqueName: \"kubernetes.io/projected/bfcbbed3-11b9-4b33-aa07-519bf4877cdd-kube-api-access-27n9k\") pod \"network-check-source-8894fc9bd-fc9xv\" (UID: \"bfcbbed3-11b9-4b33-aa07-519bf4877cdd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" Apr 24 22:32:15.086564 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27qm8\" (UniqueName: \"kubernetes.io/projected/6ab9f5f8-559e-4176-b778-01a1a51317a4-kube-api-access-27qm8\") pod \"volume-data-source-validator-7c6cbb6c87-vx8c6\" (UID: \"6ab9f5f8-559e-4176-b778-01a1a51317a4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" Apr 24 22:32:15.086564 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352df9b5-3d38-4772-8e28-cff124503696-config\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.086564 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-stats-auth\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.086564 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0783a162-f638-447b-b28a-38a88c620edb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.086635 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-default-certificate\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.086697 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:15.586678137 +0000 UTC m=+141.508321396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : secret "router-metrics-certs-default" not found Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352df9b5-3d38-4772-8e28-cff124503696-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.086842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.086767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxcj\" (UniqueName: \"kubernetes.io/projected/0783a162-f638-447b-b28a-38a88c620edb-kube-api-access-zrxcj\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.087126 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.086852 
2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:15.586840858 +0000 UTC m=+141.508484101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : configmap references non-existent config key: service-ca.crt Apr 24 22:32:15.088886 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.088860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-stats-auth\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.089003 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.088982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-default-certificate\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.096179 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.096143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qm8\" (UniqueName: \"kubernetes.io/projected/6ab9f5f8-559e-4176-b778-01a1a51317a4-kube-api-access-27qm8\") pod \"volume-data-source-validator-7c6cbb6c87-vx8c6\" (UID: \"6ab9f5f8-559e-4176-b778-01a1a51317a4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" Apr 24 22:32:15.096967 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.096949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5wl\" (UniqueName: \"kubernetes.io/projected/4f41768b-12df-4ac4-ab57-68ec0bada16d-kube-api-access-hr5wl\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:15.187100 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxcj\" (UniqueName: \"kubernetes.io/projected/0783a162-f638-447b-b28a-38a88c620edb-kube-api-access-zrxcj\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.187249 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0783a162-f638-447b-b28a-38a88c620edb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.187249 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t95qm\" (UniqueName: 
\"kubernetes.io/projected/352df9b5-3d38-4772-8e28-cff124503696-kube-api-access-t95qm\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.187350 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27n9k\" (UniqueName: \"kubernetes.io/projected/bfcbbed3-11b9-4b33-aa07-519bf4877cdd-kube-api-access-27n9k\") pod \"network-check-source-8894fc9bd-fc9xv\" (UID: \"bfcbbed3-11b9-4b33-aa07-519bf4877cdd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" Apr 24 22:32:15.187350 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352df9b5-3d38-4772-8e28-cff124503696-config\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.187444 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0783a162-f638-447b-b28a-38a88c620edb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.187444 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352df9b5-3d38-4772-8e28-cff124503696-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.187671 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0783a162-f638-447b-b28a-38a88c620edb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.187884 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.187861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352df9b5-3d38-4772-8e28-cff124503696-config\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" Apr 24 22:32:15.189526 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.189506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0783a162-f638-447b-b28a-38a88c620edb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" Apr 24 22:32:15.189581 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.189544 2575 
Apr 24 22:32:15.195141 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.195115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t95qm\" (UniqueName: \"kubernetes.io/projected/352df9b5-3d38-4772-8e28-cff124503696-kube-api-access-t95qm\") pod \"service-ca-operator-d6fc45fc5-x7x2t\" (UID: \"352df9b5-3d38-4772-8e28-cff124503696\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t"
Apr 24 22:32:15.195382 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.195364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxcj\" (UniqueName: \"kubernetes.io/projected/0783a162-f638-447b-b28a-38a88c620edb-kube-api-access-zrxcj\") pod \"kube-storage-version-migrator-operator-6769c5d45-t2xzx\" (UID: \"0783a162-f638-447b-b28a-38a88c620edb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx"
Apr 24 22:32:15.195449 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.195431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27n9k\" (UniqueName: \"kubernetes.io/projected/bfcbbed3-11b9-4b33-aa07-519bf4877cdd-kube-api-access-27n9k\") pod \"network-check-source-8894fc9bd-fc9xv\" (UID: \"bfcbbed3-11b9-4b33-aa07-519bf4877cdd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv"
Apr 24 22:32:15.200235 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.200219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"
Apr 24 22:32:15.295074 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.295036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv"
Apr 24 22:32:15.303764 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.303740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx"
Apr 24 22:32:15.309400 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.309375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t"
Apr 24 22:32:15.328621 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.327957 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6"]
Apr 24 22:32:15.332064 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:15.332036 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab9f5f8_559e_4176_b778_01a1a51317a4.slice/crio-45ae09c9aec1ac4db8ddf498530df731d58cd6178720dc184eed594e35e69039 WatchSource:0}: Error finding container 45ae09c9aec1ac4db8ddf498530df731d58cd6178720dc184eed594e35e69039: Status 404 returned error can't find the container with id 45ae09c9aec1ac4db8ddf498530df731d58cd6178720dc184eed594e35e69039
Apr 24 22:32:15.430105 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.429940 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv"]
Apr 24 22:32:15.432807 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:15.432780 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfcbbed3_11b9_4b33_aa07_519bf4877cdd.slice/crio-c058be5d58d5b5ca194a0f610ffc89805df7a6bc60014f94bdfca5daeb52f53d WatchSource:0}: Error finding container c058be5d58d5b5ca194a0f610ffc89805df7a6bc60014f94bdfca5daeb52f53d: Status 404 returned error can't find the container with id c058be5d58d5b5ca194a0f610ffc89805df7a6bc60014f94bdfca5daeb52f53d
Apr 24 22:32:15.489715 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.489689 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:15.489849 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.489819 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 22:32:15.489913 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.489891 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:16.489870434 +0000 UTC m=+142.411513689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found
Apr 24 22:32:15.591074 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.590983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:15.591074 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.591012 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 22:32:15.591074 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.591066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:15.591281 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.591083 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:16.591066577 +0000 UTC m=+142.512709821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : secret "router-metrics-certs-default" not found
Apr 24 22:32:15.591281 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:15.591148 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:16.591133738 +0000 UTC m=+142.512776991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : configmap references non-existent config key: service-ca.crt
Apr 24 22:32:15.650317 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.650279 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t"]
Apr 24 22:32:15.653441 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:15.653414 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod352df9b5_3d38_4772_8e28_cff124503696.slice/crio-dde82b2c1692fd390425b69bbbc26cb42718fa4c1f717b67d3272f74ab970360 WatchSource:0}: Error finding container dde82b2c1692fd390425b69bbbc26cb42718fa4c1f717b67d3272f74ab970360: Status 404 returned error can't find the container with id dde82b2c1692fd390425b69bbbc26cb42718fa4c1f717b67d3272f74ab970360
Apr 24 22:32:15.653785 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.653762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx"]
Apr 24 22:32:15.657008 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:15.656975 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0783a162_f638_447b_b28a_38a88c620edb.slice/crio-6be100b1620122d8ad4f6d256eade5e39f216d72e58140c9e2aeb2e313855157 WatchSource:0}: Error finding container 6be100b1620122d8ad4f6d256eade5e39f216d72e58140c9e2aeb2e313855157: Status 404 returned error can't find the container with id 6be100b1620122d8ad4f6d256eade5e39f216d72e58140c9e2aeb2e313855157
Apr 24 22:32:15.966475 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.966419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" event={"ID":"bfcbbed3-11b9-4b33-aa07-519bf4877cdd","Type":"ContainerStarted","Data":"208eeb824b40c57f9f06fa1dddc6add9a4d492ca1e5283b859c5eff9e85e3239"}
Apr 24 22:32:15.966928 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.966488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" event={"ID":"bfcbbed3-11b9-4b33-aa07-519bf4877cdd","Type":"ContainerStarted","Data":"c058be5d58d5b5ca194a0f610ffc89805df7a6bc60014f94bdfca5daeb52f53d"}
Apr 24 22:32:15.969113 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.969044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" event={"ID":"0783a162-f638-447b-b28a-38a88c620edb","Type":"ContainerStarted","Data":"6be100b1620122d8ad4f6d256eade5e39f216d72e58140c9e2aeb2e313855157"}
Apr 24 22:32:15.971185 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.971158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" event={"ID":"352df9b5-3d38-4772-8e28-cff124503696","Type":"ContainerStarted","Data":"dde82b2c1692fd390425b69bbbc26cb42718fa4c1f717b67d3272f74ab970360"}
Apr 24 22:32:15.972915 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.972885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" event={"ID":"6ab9f5f8-559e-4176-b778-01a1a51317a4","Type":"ContainerStarted","Data":"45ae09c9aec1ac4db8ddf498530df731d58cd6178720dc184eed594e35e69039"}
Apr 24 22:32:15.986801 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:15.986748 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fc9xv" podStartSLOduration=1.9867338810000001 podStartE2EDuration="1.986733881s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:15.986091932 +0000 UTC m=+141.907735193" watchObservedRunningTime="2026-04-24 22:32:15.986733881 +0000 UTC m=+141.908377143"
Apr 24 22:32:16.500576 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:16.500537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"
Apr 24 22:32:16.500819 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:16.500677 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 22:32:16.500819 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:16.500771 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.500751028 +0000 UTC m=+144.422394275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found
Apr 24 22:32:16.602105 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:16.602067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:16.602281 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:16.602147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x"
Apr 24 22:32:16.602281 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:16.602213 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 22:32:16.602495 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:16.602282 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.602260293 +0000 UTC m=+144.523903614 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : secret "router-metrics-certs-default" not found
Apr 24 22:32:16.602495 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:16.602317 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.602297502 +0000 UTC m=+144.523940761 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : configmap references non-existent config key: service-ca.crt Apr 24 22:32:16.976493 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:16.976449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" event={"ID":"6ab9f5f8-559e-4176-b778-01a1a51317a4","Type":"ContainerStarted","Data":"f6d241bf414f2c228796ad0fcc2d9859c379063760b61102d7a44d354f5d251e"} Apr 24 22:32:16.994824 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:16.994695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vx8c6" podStartSLOduration=1.596705913 podStartE2EDuration="2.9946768s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:15.334273053 +0000 UTC m=+141.255916304" lastFinishedPulling="2026-04-24 22:32:16.732243939 +0000 UTC m=+142.653887191" observedRunningTime="2026-04-24 22:32:16.994311918 +0000 UTC m=+142.915955178" watchObservedRunningTime="2026-04-24 22:32:16.9946768 +0000 UTC m=+142.916320061" Apr 24 22:32:18.521381 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:18.521341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:18.521846 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:18.521464 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:18.521846 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:18.521524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.52150988 +0000 UTC m=+148.443153130 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:18.622108 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:18.622066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:18.622306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:18.622178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:18.622306 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:18.622270 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.622229149 +0000 UTC m=+148.543872395 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : configmap references non-existent config key: service-ca.crt Apr 24 22:32:18.622306 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:18.622289 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:32:18.622473 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:18.622335 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.622323912 +0000 UTC m=+148.543967150 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : secret "router-metrics-certs-default" not found Apr 24 22:32:18.980730 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:18.980673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" event={"ID":"0783a162-f638-447b-b28a-38a88c620edb","Type":"ContainerStarted","Data":"bdc0e8caa3772428d351338c1dc8d37f464bae1250d596faf7bae8d7f1d54479"} Apr 24 22:32:18.981995 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:18.981968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" event={"ID":"352df9b5-3d38-4772-8e28-cff124503696","Type":"ContainerStarted","Data":"599eb603eb1a29585bd40d55db8d95dcfd11b0947d5cc939d9573b2436059604"} Apr 24 22:32:19.002651 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:19.002580 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" podStartSLOduration=2.710743323 podStartE2EDuration="5.00256559s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:15.65872749 +0000 UTC m=+141.580370743" lastFinishedPulling="2026-04-24 22:32:17.950549772 +0000 UTC m=+143.872193010" observedRunningTime="2026-04-24 22:32:19.001935132 +0000 UTC m=+144.923578395" watchObservedRunningTime="2026-04-24 22:32:19.00256559 +0000 UTC m=+144.924208850" Apr 24 22:32:19.018792 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:19.018729 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" podStartSLOduration=2.721118983 podStartE2EDuration="5.018708585s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:15.65545406 +0000 UTC m=+141.577097311" lastFinishedPulling="2026-04-24 22:32:17.95304367 +0000 UTC m=+143.874686913" observedRunningTime="2026-04-24 22:32:19.017317418 +0000 UTC m=+144.938960694" watchObservedRunningTime="2026-04-24 22:32:19.018708585 +0000 UTC m=+144.940351845" Apr 24 22:32:21.969215 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:21.969186 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-72jb2_b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f/dns-node-resolver/0.log" Apr 24 22:32:22.012794 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.012760 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m5qjn"] Apr 24 22:32:22.016887 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.016868 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.019953 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.019930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 22:32:22.020134 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.020119 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4nphp\"" Apr 24 22:32:22.021009 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.020991 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 22:32:22.021090 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.021074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 22:32:22.021137 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.021117 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 22:32:22.031841 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.031823 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m5qjn"] Apr 24 22:32:22.149492 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.149453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-key\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.149698 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.149515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-cabundle\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.149698 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.149611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j285j\" (UniqueName: \"kubernetes.io/projected/f391fe56-72c1-4ab4-be51-9aabcfdf804e-kube-api-access-j285j\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.250744 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.250653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-key\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.250744 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.250706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-cabundle\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.250744 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:32:22.250747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j285j\" (UniqueName: \"kubernetes.io/projected/f391fe56-72c1-4ab4-be51-9aabcfdf804e-kube-api-access-j285j\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.251440 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.251420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-cabundle\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.253003 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.252984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f391fe56-72c1-4ab4-be51-9aabcfdf804e-signing-key\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.262497 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.262476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j285j\" (UniqueName: \"kubernetes.io/projected/f391fe56-72c1-4ab4-be51-9aabcfdf804e-kube-api-access-j285j\") pod \"service-ca-865cb79987-m5qjn\" (UID: \"f391fe56-72c1-4ab4-be51-9aabcfdf804e\") " pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.325334 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.325297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m5qjn" Apr 24 22:32:22.447273 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.447240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m5qjn"] Apr 24 22:32:22.450296 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:22.450271 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf391fe56_72c1_4ab4_be51_9aabcfdf804e.slice/crio-fa45304154e146f75cb6a515b0562ab47af99b62368748fe5b2e83415813b6f4 WatchSource:0}: Error finding container fa45304154e146f75cb6a515b0562ab47af99b62368748fe5b2e83415813b6f4: Status 404 returned error can't find the container with id fa45304154e146f75cb6a515b0562ab47af99b62368748fe5b2e83415813b6f4 Apr 24 22:32:22.552544 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.552509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:22.552748 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:22.552638 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:22.552748 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:22.552702 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 
nodeName:}" failed. No retries permitted until 2026-04-24 22:32:30.552688442 +0000 UTC m=+156.474331680 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:22.653789 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.653748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:22.653997 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.653850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:22.653997 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:22.653918 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:30.653885897 +0000 UTC m=+156.575529141 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : configmap references non-existent config key: service-ca.crt Apr 24 22:32:22.653997 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:22.653965 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:32:22.654155 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:22.654006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs podName:4f41768b-12df-4ac4-ab57-68ec0bada16d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:30.653993486 +0000 UTC m=+156.575636724 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs") pod "router-default-86466888b4-mw72x" (UID: "4f41768b-12df-4ac4-ab57-68ec0bada16d") : secret "router-metrics-certs-default" not found Apr 24 22:32:22.990666 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.990630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-m5qjn" event={"ID":"f391fe56-72c1-4ab4-be51-9aabcfdf804e","Type":"ContainerStarted","Data":"8eb870231a26f32075119aca0d38e79ad842642e34765f56a9c63eca19aec2ab"} Apr 24 22:32:22.990666 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:22.990667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-m5qjn" event={"ID":"f391fe56-72c1-4ab4-be51-9aabcfdf804e","Type":"ContainerStarted","Data":"fa45304154e146f75cb6a515b0562ab47af99b62368748fe5b2e83415813b6f4"} Apr 24 22:32:23.009755 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:23.009706 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-m5qjn" podStartSLOduration=2.009690977 podStartE2EDuration="2.009690977s" podCreationTimestamp="2026-04-24 22:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:23.009122998 +0000 UTC m=+148.930766261" watchObservedRunningTime="2026-04-24 22:32:23.009690977 +0000 UTC m=+148.931334237" Apr 24 22:32:23.159549 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:23.159523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ltpdf_993dc3a3-4c4d-4d45-92a5-a952464091dc/node-ca/0.log" Apr 24 22:32:24.764921 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:24.761262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t2xzx_0783a162-f638-447b-b28a-38a88c620edb/kube-storage-version-migrator-operator/0.log" Apr 24 22:32:30.030314 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:30.030268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rmzcn" podUID="234a1a93-e9ac-4d93-9c1a-57d82a34f0fb" Apr 24 22:32:30.043496 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:30.043474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9ljzs" podUID="de7eead8-356f-4ed5-a05e-ca346be1cd7c" Apr 24 22:32:30.616798 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.616749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:30.616978 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:30.616913 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:30.617020 
ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:30.616986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls podName:544ed2db-d8bb-44fd-824b-848a3cc34ab7 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:46.616969206 +0000 UTC m=+172.538612446 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cnh9j" (UID: "544ed2db-d8bb-44fd-824b-848a3cc34ab7") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:30.717357 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.717316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:30.717527 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.717423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:30.717922 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.717901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f41768b-12df-4ac4-ab57-68ec0bada16d-service-ca-bundle\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:30.719697 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.719678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f41768b-12df-4ac4-ab57-68ec0bada16d-metrics-certs\") pod \"router-default-86466888b4-mw72x\" (UID: \"4f41768b-12df-4ac4-ab57-68ec0bada16d\") " pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:30.806795 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.806753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:30.927798 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:30.927767 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86466888b4-mw72x"] Apr 24 22:32:30.931437 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:30.931404 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f41768b_12df_4ac4_ab57_68ec0bada16d.slice/crio-3695cffdd4e6c7ec0fa73e48fc16b7e59a9d9d6e32742c55e844a3826cc7e51a WatchSource:0}: Error finding container 3695cffdd4e6c7ec0fa73e48fc16b7e59a9d9d6e32742c55e844a3826cc7e51a: Status 404 returned error can't find the container with id 3695cffdd4e6c7ec0fa73e48fc16b7e59a9d9d6e32742c55e844a3826cc7e51a Apr 24 22:32:31.016333 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.016305 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:32:31.016333 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.016307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86466888b4-mw72x" event={"ID":"4f41768b-12df-4ac4-ab57-68ec0bada16d","Type":"ContainerStarted","Data":"392ac94f079aa6b7129db208fe70f66b0d8ad1a9f1f5f2736af16f236681d6ce"} Apr 24 22:32:31.016333 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.016345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86466888b4-mw72x" event={"ID":"4f41768b-12df-4ac4-ab57-68ec0bada16d","Type":"ContainerStarted","Data":"3695cffdd4e6c7ec0fa73e48fc16b7e59a9d9d6e32742c55e844a3826cc7e51a"} Apr 24 22:32:31.016632 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.016506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:31.038349 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.038293 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-86466888b4-mw72x" podStartSLOduration=17.03827414 podStartE2EDuration="17.03827414s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:31.036983072 +0000 UTC m=+156.958626355" watchObservedRunningTime="2026-04-24 22:32:31.03827414 +0000 UTC m=+156.959917405" Apr 24 22:32:31.574371 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:32:31.574313 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6h7k6" podUID="4d5279a2-c42c-42b0-a00f-df176466bd90" Apr 24 22:32:31.807630 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.807565 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:31.810274 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:31.810252 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:32.019121 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:32.019091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:32.020429 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:32.020409 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-86466888b4-mw72x" Apr 24 22:32:34.953165 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:34.953121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:34.953165 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:34.953160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 
22:32:34.955535 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:34.955510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/234a1a93-e9ac-4d93-9c1a-57d82a34f0fb-cert\") pod \"ingress-canary-rmzcn\" (UID: \"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb\") " pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:32:34.955535 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:34.955523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de7eead8-356f-4ed5-a05e-ca346be1cd7c-metrics-tls\") pod \"dns-default-9ljzs\" (UID: \"de7eead8-356f-4ed5-a05e-ca346be1cd7c\") " pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:35.220436 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.220340 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\"" Apr 24 22:32:35.221255 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.221237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\"" Apr 24 22:32:35.228143 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.228123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:35.228216 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.228190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmzcn" Apr 24 22:32:35.355756 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.355725 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmzcn"] Apr 24 22:32:35.359536 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:35.359506 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234a1a93_e9ac_4d93_9c1a_57d82a34f0fb.slice/crio-f0abff5a74ab48fff0a13cc2ab42546190910bf98ba1594f8def46cfaac14662 WatchSource:0}: Error finding container f0abff5a74ab48fff0a13cc2ab42546190910bf98ba1594f8def46cfaac14662: Status 404 returned error can't find the container with id f0abff5a74ab48fff0a13cc2ab42546190910bf98ba1594f8def46cfaac14662 Apr 24 22:32:35.370997 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:35.370970 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9ljzs"] Apr 24 22:32:35.375048 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:35.375023 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eead8_356f_4ed5_a05e_ca346be1cd7c.slice/crio-87ba93149db728d45c6dec7aebd138c4c3bffbb58aaf318bc89ca18de95721fa WatchSource:0}: Error finding container 87ba93149db728d45c6dec7aebd138c4c3bffbb58aaf318bc89ca18de95721fa: Status 404 returned error can't find the container with id 87ba93149db728d45c6dec7aebd138c4c3bffbb58aaf318bc89ca18de95721fa Apr 24 22:32:36.031767 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:36.031717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmzcn" event={"ID":"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb","Type":"ContainerStarted","Data":"f0abff5a74ab48fff0a13cc2ab42546190910bf98ba1594f8def46cfaac14662"} Apr 24 22:32:36.033302 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:36.033274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-9ljzs" event={"ID":"de7eead8-356f-4ed5-a05e-ca346be1cd7c","Type":"ContainerStarted","Data":"87ba93149db728d45c6dec7aebd138c4c3bffbb58aaf318bc89ca18de95721fa"} Apr 24 22:32:38.039459 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.039422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmzcn" event={"ID":"234a1a93-e9ac-4d93-9c1a-57d82a34f0fb","Type":"ContainerStarted","Data":"03cc925efd1fee19d52ce876aff8b987bea64c2e46159988b35fd4b8f2fd8faf"} Apr 24 22:32:38.041008 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.040985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9ljzs" event={"ID":"de7eead8-356f-4ed5-a05e-ca346be1cd7c","Type":"ContainerStarted","Data":"f7e08a165bebae4d869f3eb1c660d105b310db5d5b3f9194483ba3e3305adc73"} Apr 24 22:32:38.041112 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.041011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9ljzs" event={"ID":"de7eead8-356f-4ed5-a05e-ca346be1cd7c","Type":"ContainerStarted","Data":"596a8de737fd08d8959f1b2f67c8b78e58f3b5261fe39fbfe00857ff77ac1e8c"} Apr 24 22:32:38.041112 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.041094 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:38.057429 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.057386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rmzcn" podStartSLOduration=130.210938608 podStartE2EDuration="2m12.057373451s" podCreationTimestamp="2026-04-24 22:30:26 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.361360522 +0000 UTC m=+161.283003761" lastFinishedPulling="2026-04-24 22:32:37.207795366 +0000 UTC m=+163.129438604" observedRunningTime="2026-04-24 22:32:38.056896689 +0000 UTC m=+163.978539960" watchObservedRunningTime="2026-04-24 22:32:38.057373451 +0000 UTC m=+163.979016711" Apr 24 22:32:38.076821 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:38.076767 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9ljzs" podStartSLOduration=130.249464631 podStartE2EDuration="2m12.076751104s" podCreationTimestamp="2026-04-24 22:30:26 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.376826227 +0000 UTC m=+161.298469469" lastFinishedPulling="2026-04-24 22:32:37.204112692 +0000 UTC m=+163.125755942" observedRunningTime="2026-04-24 22:32:38.076056036 +0000 UTC m=+163.997699318" watchObservedRunningTime="2026-04-24 22:32:38.076751104 +0000 UTC m=+163.998394367" Apr 24 22:32:41.824972 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.824934 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l227c"] Apr 24 22:32:41.828331 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.828309 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.830916 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.830895 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 22:32:41.831031 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.830898 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 22:32:41.832042 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.832023 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f6lt5\"" Apr 24 22:32:41.832042 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.832034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 22:32:41.832176 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.832022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 22:32:41.840989 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.840968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l227c"] Apr 24 22:32:41.894064 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.894029 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-78549dfc96-psndh"] Apr 24 22:32:41.897382 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.897358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.900423 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.900402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 22:32:41.900579 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.900402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 22:32:41.900579 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.900405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 22:32:41.900579 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.900510 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9rsh7\"" Apr 24 22:32:41.902956 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.902932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxdwf\" (UniqueName: \"kubernetes.io/projected/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-api-access-xxdwf\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.903078 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.902971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1881c44-a3c0-45f9-955d-c9d358c76849-ca-trust-extracted\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " 
pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903078 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b9c0d29-345d-4837-b76a-3bb9fd546efa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.903159 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-trusted-ca\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903159 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.903227 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-image-registry-private-configuration\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903227 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-certificates\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903300 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b9c0d29-345d-4837-b76a-3bb9fd546efa-data-volume\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.903300 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-bound-sa-token\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903300 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b9c0d29-345d-4837-b76a-3bb9fd546efa-crio-socket\") 
pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:41.903441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skcxg\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-kube-api-access-skcxg\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-tls\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.903441 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.903433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-installation-pull-secrets\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:41.908749 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.908680 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 22:32:41.908749 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:41.908711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78549dfc96-psndh"] Apr 24 22:32:42.004279 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-trusted-ca\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004279 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.004509 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-image-registry-private-configuration\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004509 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-certificates\") pod 
\"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004509 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b9c0d29-345d-4837-b76a-3bb9fd546efa-data-volume\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.004684 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-bound-sa-token\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004684 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b9c0d29-345d-4837-b76a-3bb9fd546efa-crio-socket\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.004684 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skcxg\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-kube-api-access-skcxg\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004684 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-tls\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004684 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b9c0d29-345d-4837-b76a-3bb9fd546efa-crio-socket\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.004928 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-installation-pull-secrets\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.004928 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxdwf\" (UniqueName: \"kubernetes.io/projected/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-api-access-xxdwf\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " 
pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.004928 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.004894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1881c44-a3c0-45f9-955d-c9d358c76849-ca-trust-extracted\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.005172 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1881c44-a3c0-45f9-955d-c9d358c76849-ca-trust-extracted\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.005250 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b9c0d29-345d-4837-b76a-3bb9fd546efa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.005352 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b9c0d29-345d-4837-b76a-3bb9fd546efa-data-volume\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.005410 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-trusted-ca\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.005410 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-certificates\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.005512 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.005492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.007263 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.007239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-registry-tls\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.007478 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:32:42.007462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b9c0d29-345d-4837-b76a-3bb9fd546efa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.007526 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.007476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-installation-pull-secrets\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.007606 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.007576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1881c44-a3c0-45f9-955d-c9d358c76849-image-registry-private-configuration\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.014273 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.014248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxdwf\" (UniqueName: \"kubernetes.io/projected/9b9c0d29-345d-4837-b76a-3bb9fd546efa-kube-api-access-xxdwf\") pod \"insights-runtime-extractor-l227c\" (UID: \"9b9c0d29-345d-4837-b76a-3bb9fd546efa\") " pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.014741 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.014720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-bound-sa-token\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.014862 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.014838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skcxg\" (UniqueName: \"kubernetes.io/projected/b1881c44-a3c0-45f9-955d-c9d358c76849-kube-api-access-skcxg\") pod \"image-registry-78549dfc96-psndh\" (UID: \"b1881c44-a3c0-45f9-955d-c9d358c76849\") " pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.137883 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.137798 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l227c" Apr 24 22:32:42.208130 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.207805 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:42.257295 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.257232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l227c"] Apr 24 22:32:42.261722 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:42.261668 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b9c0d29_345d_4837_b76a_3bb9fd546efa.slice/crio-f755599005d2d9593700594650abc865c37a00f69a90346d2c29bb8577f1f572 WatchSource:0}: Error finding container f755599005d2d9593700594650abc865c37a00f69a90346d2c29bb8577f1f572: Status 404 returned error can't find the container with id f755599005d2d9593700594650abc865c37a00f69a90346d2c29bb8577f1f572 Apr 24 22:32:42.331514 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:42.331478 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78549dfc96-psndh"] Apr 24 22:32:42.336646 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:42.336617 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1881c44_a3c0_45f9_955d_c9d358c76849.slice/crio-dcbdcc42347ccb24c3d1560647e96ec67f4ebe9831eb27dab16112675ab5fdd0 WatchSource:0}: Error finding container dcbdcc42347ccb24c3d1560647e96ec67f4ebe9831eb27dab16112675ab5fdd0: Status 404 returned error can't find the container with id dcbdcc42347ccb24c3d1560647e96ec67f4ebe9831eb27dab16112675ab5fdd0 Apr 24 22:32:43.057102 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.057067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78549dfc96-psndh" event={"ID":"b1881c44-a3c0-45f9-955d-c9d358c76849","Type":"ContainerStarted","Data":"d74a8343929e76fa1f6a05dd560f04a45a85cf62b2ac68168e6f77afe093188a"} Apr 24 22:32:43.057102 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.057104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78549dfc96-psndh" event={"ID":"b1881c44-a3c0-45f9-955d-c9d358c76849","Type":"ContainerStarted","Data":"dcbdcc42347ccb24c3d1560647e96ec67f4ebe9831eb27dab16112675ab5fdd0"} Apr 24 22:32:43.057654 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.057135 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:32:43.058428 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.058406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l227c" event={"ID":"9b9c0d29-345d-4837-b76a-3bb9fd546efa","Type":"ContainerStarted","Data":"572aae415cd29cac669410b2fff5e31f870c8f694919f4987c57ad0a55924a1e"} Apr 24 22:32:43.058507 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.058431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l227c" event={"ID":"9b9c0d29-345d-4837-b76a-3bb9fd546efa","Type":"ContainerStarted","Data":"3b267a94a2291493c745711253e19c9d95cb9688cddb3d5513fd16945462f583"} Apr 24 22:32:43.058507 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.058440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l227c" 
event={"ID":"9b9c0d29-345d-4837-b76a-3bb9fd546efa","Type":"ContainerStarted","Data":"f755599005d2d9593700594650abc865c37a00f69a90346d2c29bb8577f1f572"} Apr 24 22:32:43.079457 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:43.079357 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-78549dfc96-psndh" podStartSLOduration=2.079343063 podStartE2EDuration="2.079343063s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:43.07764764 +0000 UTC m=+168.999290899" watchObservedRunningTime="2026-04-24 22:32:43.079343063 +0000 UTC m=+169.000986322" Apr 24 22:32:45.065746 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:45.065711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l227c" event={"ID":"9b9c0d29-345d-4837-b76a-3bb9fd546efa","Type":"ContainerStarted","Data":"0b059c31aa44ef4c8d1a4235b065fe103e3b8f2836c9f53778739a558298a443"} Apr 24 22:32:45.084339 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:45.084285 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l227c" podStartSLOduration=2.217602868 podStartE2EDuration="4.084270873s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="2026-04-24 22:32:42.316863613 +0000 UTC m=+168.238506856" lastFinishedPulling="2026-04-24 22:32:44.18353162 +0000 UTC m=+170.105174861" observedRunningTime="2026-04-24 22:32:45.083434895 +0000 UTC m=+171.005078155" watchObservedRunningTime="2026-04-24 22:32:45.084270873 +0000 UTC m=+171.005914133" Apr 24 22:32:46.561779 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:46.561742 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:32:46.647268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:46.647227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:46.649660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:46.649628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/544ed2db-d8bb-44fd-824b-848a3cc34ab7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cnh9j\" (UID: \"544ed2db-d8bb-44fd-824b-848a3cc34ab7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:46.903791 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:46.903712 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" Apr 24 22:32:47.017810 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:47.017673 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j"] Apr 24 22:32:47.020097 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:47.020072 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544ed2db_d8bb_44fd_824b_848a3cc34ab7.slice/crio-ec0afb43970db54f323ea1062eee6cf35ff49e97c761b9362a52ec171de8044c WatchSource:0}: Error finding container ec0afb43970db54f323ea1062eee6cf35ff49e97c761b9362a52ec171de8044c: Status 404 returned error can't find the container with id ec0afb43970db54f323ea1062eee6cf35ff49e97c761b9362a52ec171de8044c Apr 24 22:32:47.074679 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:47.074646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" event={"ID":"544ed2db-d8bb-44fd-824b-848a3cc34ab7","Type":"ContainerStarted","Data":"ec0afb43970db54f323ea1062eee6cf35ff49e97c761b9362a52ec171de8044c"} Apr 24 22:32:48.046413 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:48.046380 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9ljzs" Apr 24 22:32:49.085062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.085024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" event={"ID":"544ed2db-d8bb-44fd-824b-848a3cc34ab7","Type":"ContainerStarted","Data":"2e38548781834c47cef474ee7c0b7b5512d28a47239463fa7c7ecd67119ffa65"} Apr 24 22:32:49.103479 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.103452 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l"] Apr 24 22:32:49.106513 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.106485 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:49.107144 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.107105 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cnh9j" podStartSLOduration=33.530795355 podStartE2EDuration="35.107091955s" podCreationTimestamp="2026-04-24 22:32:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:47.02196636 +0000 UTC m=+172.943609599" lastFinishedPulling="2026-04-24 22:32:48.598262957 +0000 UTC m=+174.519906199" observedRunningTime="2026-04-24 22:32:49.105887435 +0000 UTC m=+175.027530697" watchObservedRunningTime="2026-04-24 22:32:49.107091955 +0000 UTC m=+175.028735215" Apr 24 22:32:49.108940 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.108916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 22:32:49.109054 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.108966 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-5mtx4\"" Apr 24 22:32:49.114116 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.114094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l"] Apr 24 22:32:49.169716 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.169679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d9944c42-61bb-48d9-8504-dcf1430a7af5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q9w7l\" (UID: \"d9944c42-61bb-48d9-8504-dcf1430a7af5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:49.270994 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.270938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d9944c42-61bb-48d9-8504-dcf1430a7af5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q9w7l\" (UID: \"d9944c42-61bb-48d9-8504-dcf1430a7af5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:49.273444 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.273408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d9944c42-61bb-48d9-8504-dcf1430a7af5-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q9w7l\" (UID: \"d9944c42-61bb-48d9-8504-dcf1430a7af5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:49.415834 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.415744 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:49.528128 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:49.528092 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l"] Apr 24 22:32:49.531207 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:32:49.531181 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9944c42_61bb_48d9_8504_dcf1430a7af5.slice/crio-ae71fe36ce90a684affb88d6d9f6a1ca3234d2a2143ececbf167e81445af67ec WatchSource:0}: Error finding container ae71fe36ce90a684affb88d6d9f6a1ca3234d2a2143ececbf167e81445af67ec: Status 404 returned error can't find the container with id ae71fe36ce90a684affb88d6d9f6a1ca3234d2a2143ececbf167e81445af67ec Apr 24 22:32:50.088795 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:50.088720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" event={"ID":"d9944c42-61bb-48d9-8504-dcf1430a7af5","Type":"ContainerStarted","Data":"ae71fe36ce90a684affb88d6d9f6a1ca3234d2a2143ececbf167e81445af67ec"} Apr 24 22:32:51.092687 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:51.092651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" event={"ID":"d9944c42-61bb-48d9-8504-dcf1430a7af5","Type":"ContainerStarted","Data":"60969e52cf056d79a904eb0e36c14b64a1f8f201772d277d07673ca913cd3780"} Apr 24 22:32:51.093142 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:51.092849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:51.097446 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:51.097424 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" Apr 24 22:32:51.109255 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:32:51.109214 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q9w7l" podStartSLOduration=1.241000512 podStartE2EDuration="2.109200618s" podCreationTimestamp="2026-04-24 22:32:49 +0000 UTC" firstStartedPulling="2026-04-24 22:32:49.533422167 +0000 UTC m=+175.455065407" lastFinishedPulling="2026-04-24 22:32:50.40162226 +0000 UTC m=+176.323265513" observedRunningTime="2026-04-24 22:32:51.108522768 +0000 UTC m=+177.030166029" watchObservedRunningTime="2026-04-24 22:32:51.109200618 +0000 UTC m=+177.030843918" Apr 24 22:33:02.549628 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.549580 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cbb7c"] Apr 24 22:33:02.554892 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.554860 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srwcn"] Apr 24 22:33:02.555052 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.555034 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.557917 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.557878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:33:02.557917 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.557890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6nxsq\"" Apr 24 22:33:02.558233 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.558212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.558361 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.558248 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:33:02.558649 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.558510 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:33:02.558649 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.558575 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:33:02.560741 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.560722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:33:02.561229 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.561207 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 22:33:02.562215 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.562183 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-jvlp6\"" Apr 24 22:33:02.562974 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.562951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 22:33:02.567010 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.566988 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srwcn"] Apr 24 22:33:02.673144 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673322 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-root\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673322 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.673322 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-textfile\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673322 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvpd\" (UniqueName: \"kubernetes.io/projected/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-api-access-8rvpd\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.673559 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.673559 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.673559 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-wtmp\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673559 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.673788 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-tls\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673788 ip-10-0-132-138 
kubenswrapper[2575]: I0424 22:33:02.673631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673788 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzg6\" (UniqueName: \"kubernetes.io/projected/fb1c64b1-1c7e-4825-9e30-821260908c4b-kube-api-access-bzzg6\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673788 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-sys\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673788 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-metrics-client-ca\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.673976 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.673824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774479 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774479 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-textfile\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.774784 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvpd\" (UniqueName: \"kubernetes.io/projected/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-api-access-8rvpd\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774784 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:33:02.774533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774784 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774784 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-wtmp\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.774991 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-wtmp\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.774991 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.774991 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-textfile\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.774991 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-tls\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.774991 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.774922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bzzg6\" (UniqueName: \"kubernetes.io/projected/fb1c64b1-1c7e-4825-9e30-821260908c4b-kube-api-access-bzzg6\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-sys\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-metrics-client-ca\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-sys\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-root\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775671 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb1c64b1-1c7e-4825-9e30-821260908c4b-root\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " 
pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775671 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.775671 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.775819 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-accelerators-collector-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.775954 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.775906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c64b1-1c7e-4825-9e30-821260908c4b-metrics-client-ca\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.777678 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.777656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.777847 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.777828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.777976 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.777961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb1c64b1-1c7e-4825-9e30-821260908c4b-node-exporter-tls\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.778205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.778188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.791616 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.791574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvpd\" (UniqueName: \"kubernetes.io/projected/7dc08650-dc67-4bbf-adc1-df7bb1a6d15b-kube-api-access-8rvpd\") pod \"kube-state-metrics-69db897b98-srwcn\" (UID: \"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.792719 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.792695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzg6\" (UniqueName: \"kubernetes.io/projected/fb1c64b1-1c7e-4825-9e30-821260908c4b-kube-api-access-bzzg6\") pod \"node-exporter-cbb7c\" (UID: \"fb1c64b1-1c7e-4825-9e30-821260908c4b\") " pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.870514 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.870424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cbb7c" Apr 24 22:33:02.879089 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:02.878726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" Apr 24 22:33:02.883389 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:33:02.883364 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1c64b1_1c7e_4825_9e30_821260908c4b.slice/crio-afa0448797383596758f86685aaa115b92e27f87eab8463e999eb8b33568bdd0 WatchSource:0}: Error finding container afa0448797383596758f86685aaa115b92e27f87eab8463e999eb8b33568bdd0: Status 404 returned error can't find the container with id afa0448797383596758f86685aaa115b92e27f87eab8463e999eb8b33568bdd0 Apr 24 22:33:03.011043 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.011005 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srwcn"] Apr 24 22:33:03.014113 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:33:03.014083 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc08650_dc67_4bbf_adc1_df7bb1a6d15b.slice/crio-8fc4bea7a92f2097ae334184e201a9edeaa1ceae4f926ae87c299e2f8fcaaa34 WatchSource:0}: Error finding container 8fc4bea7a92f2097ae334184e201a9edeaa1ceae4f926ae87c299e2f8fcaaa34: Status 404 returned error can't find the container with id 8fc4bea7a92f2097ae334184e201a9edeaa1ceae4f926ae87c299e2f8fcaaa34 Apr 24 22:33:03.123947 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.123848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" event={"ID":"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b","Type":"ContainerStarted","Data":"8fc4bea7a92f2097ae334184e201a9edeaa1ceae4f926ae87c299e2f8fcaaa34"} Apr 24 22:33:03.124786 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.124759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cbb7c" event={"ID":"fb1c64b1-1c7e-4825-9e30-821260908c4b","Type":"ContainerStarted","Data":"afa0448797383596758f86685aaa115b92e27f87eab8463e999eb8b33568bdd0"} Apr 24 22:33:03.610209 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.610163 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:03.614121 
ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.614093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.616571 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.616533 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 22:33:03.616803 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.616781 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 22:33:03.616909 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.616829 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 22:33:03.616909 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.616891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 22:33:03.617319 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.617261 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 22:33:03.617427 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.617318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 22:33:03.617427 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.617270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7wqld\"" Apr 24 22:33:03.617847 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.617654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 22:33:03.617847 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.617680 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 22:33:03.618275 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.618249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 22:33:03.624949 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.624347 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:03.685745 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685791 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-out\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9lw2\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-kube-api-access-d9lw2\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-web-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.685912 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.686286 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.685938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.686286 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.686002 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-volume\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.686286 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.686023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.686428 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.686389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.786875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.786933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.786964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.786990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-out\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9lw2\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-kube-api-access-d9lw2\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-web-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-volume\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:33:03.787754 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:33:03.787822 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls 
podName:9912da88-6cdd-473c-855f-ae7c8dc4302a nodeName:}" failed. No retries permitted until 2026-04-24 22:33:04.287802651 +0000 UTC m=+190.209445892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "9912da88-6cdd-473c-855f-ae7c8dc4302a") : secret "alertmanager-main-tls" not found Apr 24 22:33:03.788013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.787827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.788913 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:33:03.788425 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle podName:9912da88-6cdd-473c-855f-ae7c8dc4302a nodeName:}" failed. No retries permitted until 2026-04-24 22:33:04.288404274 +0000 UTC m=+190.210047518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "9912da88-6cdd-473c-855f-ae7c8dc4302a") : configmap references non-existent config key: ca-bundle.crt Apr 24 22:33:03.789480 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.789437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.792516 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.792446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.792516 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.792477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.793008 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.792966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-out\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.795985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.795164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.795985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.795581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.795985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.795944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-web-config\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.796609 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.796573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.797321 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.797284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-config-volume\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:03.799677 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:03.799654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9lw2\" (UniqueName: \"kubernetes.io/projected/9912da88-6cdd-473c-855f-ae7c8dc4302a-kube-api-access-d9lw2\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.066796 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.066755 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-78549dfc96-psndh" Apr 24 22:33:04.129695 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.129660 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb1c64b1-1c7e-4825-9e30-821260908c4b" containerID="b1dfec40d1878429412bc5b1e20ae5eed115ceac00c34e2c4f2c3a995174331e" exitCode=0 Apr 24 22:33:04.129867 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.129754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cbb7c" event={"ID":"fb1c64b1-1c7e-4825-9e30-821260908c4b","Type":"ContainerDied","Data":"b1dfec40d1878429412bc5b1e20ae5eed115ceac00c34e2c4f2c3a995174331e"} Apr 24 22:33:04.292395 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.292351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.292550 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.292427 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.293383 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.293358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9912da88-6cdd-473c-855f-ae7c8dc4302a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.295997 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.295939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9912da88-6cdd-473c-855f-ae7c8dc4302a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9912da88-6cdd-473c-855f-ae7c8dc4302a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.528297 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.528265 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:04.681630 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:04.681476 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:04.684315 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:33:04.684290 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9912da88_6cdd_473c_855f_ae7c8dc4302a.slice/crio-15c9bdfdc0dd59e140ccf90c6bf3cdb6a32afc6e6734258d45c0200888fb4189 WatchSource:0}: Error finding container 15c9bdfdc0dd59e140ccf90c6bf3cdb6a32afc6e6734258d45c0200888fb4189: Status 404 returned error can't find the container with id 15c9bdfdc0dd59e140ccf90c6bf3cdb6a32afc6e6734258d45c0200888fb4189 Apr 24 22:33:05.133990 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.133955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"15c9bdfdc0dd59e140ccf90c6bf3cdb6a32afc6e6734258d45c0200888fb4189"} Apr 24 22:33:05.135503 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.135475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" event={"ID":"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b","Type":"ContainerStarted","Data":"dc49c9b0ed08b26f687a1e64afa972eb87a7cab3ebf48dd6e0bd5edc7e008a4b"} Apr 24 22:33:05.135636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.135508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" event={"ID":"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b","Type":"ContainerStarted","Data":"2fe69d47a2587d900512a6942e80223822489195313f3ce38060fcb4b72f2b10"} Apr 24 22:33:05.135636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.135520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" event={"ID":"7dc08650-dc67-4bbf-adc1-df7bb1a6d15b","Type":"ContainerStarted","Data":"a2787132fae3007e9ea0ab3fef5cabba145d866fee29f0279ec8efdd5cbf304d"} Apr 24 22:33:05.137186 
ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.137161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cbb7c" event={"ID":"fb1c64b1-1c7e-4825-9e30-821260908c4b","Type":"ContainerStarted","Data":"2c48745f761ba73c8ed01beb5d09ebc95be8e147760b1559a2013a3d3991598d"} Apr 24 22:33:05.137280 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.137190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cbb7c" event={"ID":"fb1c64b1-1c7e-4825-9e30-821260908c4b","Type":"ContainerStarted","Data":"608df9403ddd6578a0d24223930d064d39adf50603a76183e12b657a849b91c6"} Apr 24 22:33:05.173194 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.173136 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cbb7c" podStartSLOduration=2.418441695 podStartE2EDuration="3.173121355s" podCreationTimestamp="2026-04-24 22:33:02 +0000 UTC" firstStartedPulling="2026-04-24 22:33:02.884819069 +0000 UTC m=+188.806462322" lastFinishedPulling="2026-04-24 22:33:03.639498738 +0000 UTC m=+189.561141982" observedRunningTime="2026-04-24 22:33:05.172326843 +0000 UTC m=+191.093970106" watchObservedRunningTime="2026-04-24 22:33:05.173121355 +0000 UTC m=+191.094764616" Apr 24 22:33:05.173367 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.173245 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-srwcn" podStartSLOduration=1.9372436899999999 podStartE2EDuration="3.173240291s" podCreationTimestamp="2026-04-24 22:33:02 +0000 UTC" firstStartedPulling="2026-04-24 22:33:03.016105165 +0000 UTC m=+188.937748407" lastFinishedPulling="2026-04-24 22:33:04.252101759 +0000 UTC m=+190.173745008" observedRunningTime="2026-04-24 22:33:05.155806448 +0000 UTC m=+191.077449709" watchObservedRunningTime="2026-04-24 22:33:05.173240291 +0000 UTC m=+191.094883551" Apr 24 22:33:05.703494 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.703459 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx"] Apr 24 22:33:05.710610 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.710565 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.713445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713198 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-udb59at203eg\"" Apr 24 22:33:05.713445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713245 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 22:33:05.713445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713292 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 22:33:05.713445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 22:33:05.713445 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 22:33:05.713863 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 22:33:05.713863 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.713684 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-z4jk4\"" Apr 24 22:33:05.717226 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.717201 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx"] Apr 24 22:33:05.806695 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-grpc-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806826 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806826 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c870b053-b8ff-47de-8360-f05275ea8f7b-metrics-client-ca\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806826 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806941 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806941 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.806941 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.807068 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.806959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdpq\" (UniqueName: \"kubernetes.io/projected/c870b053-b8ff-47de-8360-f05275ea8f7b-kube-api-access-mxdpq\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907544 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907544 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " 
pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdpq\" (UniqueName: \"kubernetes.io/projected/c870b053-b8ff-47de-8360-f05275ea8f7b-kube-api-access-mxdpq\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-grpc-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c870b053-b8ff-47de-8360-f05275ea8f7b-metrics-client-ca\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.907828 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.907734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.908643 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.908556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c870b053-b8ff-47de-8360-f05275ea8f7b-metrics-client-ca\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910470 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") 
" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910730 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-grpc-tls\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910789 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910789 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.910789 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.910780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c870b053-b8ff-47de-8360-f05275ea8f7b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:05.916102 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:05.916078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdpq\" (UniqueName: \"kubernetes.io/projected/c870b053-b8ff-47de-8360-f05275ea8f7b-kube-api-access-mxdpq\") pod \"thanos-querier-5cd8874bb9-cd8jx\" (UID: \"c870b053-b8ff-47de-8360-f05275ea8f7b\") " pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:06.023270 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:06.023173 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:06.141115 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:06.141080 2575 generic.go:358] "Generic (PLEG): container finished" podID="9912da88-6cdd-473c-855f-ae7c8dc4302a" containerID="93e67bc34027903808a4fef69fa5c650b3560d57e3852852d54b4d44aad037d6" exitCode=0 Apr 24 22:33:06.141277 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:06.141170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerDied","Data":"93e67bc34027903808a4fef69fa5c650b3560d57e3852852d54b4d44aad037d6"} Apr 24 22:33:06.150780 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:06.150759 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx"] Apr 24 22:33:06.152798 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:33:06.152762 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc870b053_b8ff_47de_8360_f05275ea8f7b.slice/crio-b1da1d4a8b2c45080180f6c5c53c49d795936db80659549acb5dd6fc4d8bb6d8 WatchSource:0}: Error finding container b1da1d4a8b2c45080180f6c5c53c49d795936db80659549acb5dd6fc4d8bb6d8: Status 404 returned error can't find the container with id b1da1d4a8b2c45080180f6c5c53c49d795936db80659549acb5dd6fc4d8bb6d8 Apr 24 22:33:07.152446 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:07.152400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"b1da1d4a8b2c45080180f6c5c53c49d795936db80659549acb5dd6fc4d8bb6d8"} Apr 24 22:33:08.159741 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:08.159709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"cbb10aeebaafff5a37d7e86649efbbcf17fcb8b586fbeec39026742403e1d082"} Apr 24 22:33:08.162137 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:08.162113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"b042dfdd10ae239471750630678154cf3b1aa63c5c41236f29026c5d59eaafa9"} Apr 24 22:33:09.168065 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.168018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"59d7e612165e2dc2028e9091827d5d798ba6dd472914724427b09e48073c1479"} Apr 24 22:33:09.168065 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.168057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"cb748c3346357a6c2132d6421cb22b58327e852ade356cffb99695d6d65ddfe1"} Apr 24 22:33:09.168065 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.168070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"11ca1c25ffffb986a5effe932ec32fcaf725c5706c06259f8b1bd3b2cf143ef8"} Apr 24 22:33:09.168562 ip-10-0-132-138 kubenswrapper[2575]: I0424 
22:33:09.168083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"aa4735b79be9282162f6cb31302669bc4358e2f470201902bb777e7778b19226"} Apr 24 22:33:09.168562 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.168095 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9912da88-6cdd-473c-855f-ae7c8dc4302a","Type":"ContainerStarted","Data":"f18b19dfe25eb9f330a13aaeebbc0eefa1ab71cfb04556bb59a3e34ed7112d9c"} Apr 24 22:33:09.170059 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.170036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"b518cd2bfd9eb59cc5c6f6b825cdb1bfa9a1a7b7fa68745a878423c03a9d8ee8"} Apr 24 22:33:09.170146 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.170066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"6c420b8a70641035d472af3c773068f6cb2bcd1ab6d85c99132448ff3c4a3ced"} Apr 24 22:33:09.170146 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.170080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"a40c461fd3c07d6161d53c8128a8a3c463e2f2ddcfe36954b9f04ffe012cbfd3"} Apr 24 22:33:09.198223 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:09.198170 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.831762527 podStartE2EDuration="6.198149424s" podCreationTimestamp="2026-04-24 22:33:03 +0000 UTC" firstStartedPulling="2026-04-24 22:33:04.686686928 +0000 UTC m=+190.608330170" lastFinishedPulling="2026-04-24 22:33:09.053073824 +0000 UTC m=+194.974717067" observedRunningTime="2026-04-24 22:33:09.195367331 +0000 UTC m=+195.117010593" watchObservedRunningTime="2026-04-24 22:33:09.198149424 +0000 UTC m=+195.119792686" Apr 24 22:33:10.175227 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:10.175190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"5d968dc8367953149f7562651d4678e9bcd0dc46680c823615c5c98e4989417a"} Apr 24 22:33:10.175227 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:10.175235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" event={"ID":"c870b053-b8ff-47de-8360-f05275ea8f7b","Type":"ContainerStarted","Data":"740bc72d0166f7268b59dd6afd5019f5971d8581d0f4899a440b910598d0b48c"} Apr 24 22:33:10.175724 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:10.175701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:10.199273 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:10.199208 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" podStartSLOduration=2.277040654 podStartE2EDuration="5.199189031s" podCreationTimestamp="2026-04-24 22:33:05 +0000 UTC" 
firstStartedPulling="2026-04-24 22:33:06.154541865 +0000 UTC m=+192.076185104" lastFinishedPulling="2026-04-24 22:33:09.076690242 +0000 UTC m=+194.998333481" observedRunningTime="2026-04-24 22:33:10.198885447 +0000 UTC m=+196.120528721" watchObservedRunningTime="2026-04-24 22:33:10.199189031 +0000 UTC m=+196.120832294" Apr 24 22:33:16.183858 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:16.183830 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5cd8874bb9-cd8jx" Apr 24 22:33:49.284920 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:49.284884 2575 generic.go:358] "Generic (PLEG): container finished" podID="352df9b5-3d38-4772-8e28-cff124503696" containerID="599eb603eb1a29585bd40d55db8d95dcfd11b0947d5cc939d9573b2436059604" exitCode=0 Apr 24 22:33:49.285345 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:49.284956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" event={"ID":"352df9b5-3d38-4772-8e28-cff124503696","Type":"ContainerDied","Data":"599eb603eb1a29585bd40d55db8d95dcfd11b0947d5cc939d9573b2436059604"} Apr 24 22:33:49.285345 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:49.285262 2575 scope.go:117] "RemoveContainer" containerID="599eb603eb1a29585bd40d55db8d95dcfd11b0947d5cc939d9573b2436059604" Apr 24 22:33:50.288993 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:50.288961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x7x2t" event={"ID":"352df9b5-3d38-4772-8e28-cff124503696","Type":"ContainerStarted","Data":"9e0e7dfd41c1aa62a62d963f1a061e76147a1eab867057a18c03367db11ed809"} Apr 24 22:33:54.302182 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:54.302148 2575 generic.go:358] "Generic (PLEG): container finished" podID="0783a162-f638-447b-b28a-38a88c620edb" containerID="bdc0e8caa3772428d351338c1dc8d37f464bae1250d596faf7bae8d7f1d54479" exitCode=0 Apr 24 22:33:54.302566 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:54.302201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" event={"ID":"0783a162-f638-447b-b28a-38a88c620edb","Type":"ContainerDied","Data":"bdc0e8caa3772428d351338c1dc8d37f464bae1250d596faf7bae8d7f1d54479"} Apr 24 22:33:54.302566 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:54.302485 2575 scope.go:117] "RemoveContainer" containerID="bdc0e8caa3772428d351338c1dc8d37f464bae1250d596faf7bae8d7f1d54479" Apr 24 22:33:55.306764 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:33:55.306732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t2xzx" event={"ID":"0783a162-f638-447b-b28a-38a88c620edb","Type":"ContainerStarted","Data":"9adb0a6479d133ea73195d3643a0c4c6a093ae5c31e0823d6cea433e2e1303f4"} Apr 24 22:34:06.335801 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:06.335757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:34:06.338151 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:06.338125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5279a2-c42c-42b0-a00f-df176466bd90-metrics-certs\") pod \"network-metrics-daemon-6h7k6\" (UID: \"4d5279a2-c42c-42b0-a00f-df176466bd90\") " pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:34:06.366130 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:06.366097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\"" Apr 24 22:34:06.373502 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:06.373477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6h7k6" Apr 24 22:34:06.498057 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:06.498025 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6h7k6"] Apr 24 22:34:06.501671 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:34:06.501634 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5279a2_c42c_42b0_a00f_df176466bd90.slice/crio-7e3484eba2555688bef3588bcd7d7c083ac0f421033148f5efdf04c9e4026dda WatchSource:0}: Error finding container 7e3484eba2555688bef3588bcd7d7c083ac0f421033148f5efdf04c9e4026dda: Status 404 returned error can't find the container with id 7e3484eba2555688bef3588bcd7d7c083ac0f421033148f5efdf04c9e4026dda Apr 24 22:34:07.344491 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:07.344441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6h7k6" event={"ID":"4d5279a2-c42c-42b0-a00f-df176466bd90","Type":"ContainerStarted","Data":"7e3484eba2555688bef3588bcd7d7c083ac0f421033148f5efdf04c9e4026dda"} Apr 24 22:34:08.348890 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:08.348800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6h7k6" event={"ID":"4d5279a2-c42c-42b0-a00f-df176466bd90","Type":"ContainerStarted","Data":"63b340c91f0c1749e909ce53d6ee1639b5c06d0a6c2f03c01a6ee2cafe475195"} Apr 24 22:34:08.348890 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:08.348840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6h7k6" event={"ID":"4d5279a2-c42c-42b0-a00f-df176466bd90","Type":"ContainerStarted","Data":"0ae9fa2df78b3c37ee03e3f9cb5bc6de07f40b6df0893f3bb9a418c87b059f8c"} Apr 24 22:34:08.366107 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:08.366049 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6h7k6" podStartSLOduration=253.28588826 podStartE2EDuration="4m14.366032356s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:34:06.503644899 +0000 UTC m=+252.425288141" lastFinishedPulling="2026-04-24 22:34:07.58378899 +0000 UTC m=+253.505432237" observedRunningTime="2026-04-24 22:34:08.364379748 +0000 UTC m=+254.286023010" watchObservedRunningTime="2026-04-24 22:34:08.366032356 +0000 UTC m=+254.287675617" Apr 24 22:34:54.519904 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:34:54.519870 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:35:06.587200 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.587169 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7nh6n"] Apr 24 22:35:06.590157 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.590141 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.592658 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.592636 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:35:06.597138 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.597069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7nh6n"] Apr 24 22:35:06.726845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.726813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-kubelet-config\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.727010 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.726874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-dbus\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.727010 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.726929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b64af9f1-9022-4deb-8138-e644bc894d82-original-pull-secret\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.827917 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.827884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-dbus\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.828053 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.827929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b64af9f1-9022-4deb-8138-e644bc894d82-original-pull-secret\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.828053 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.827956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-kubelet-config\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.828120 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.828059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-kubelet-config\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.828120 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.828096 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b64af9f1-9022-4deb-8138-e644bc894d82-dbus\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.830328 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.830296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b64af9f1-9022-4deb-8138-e644bc894d82-original-pull-secret\") pod \"global-pull-secret-syncer-7nh6n\" (UID: \"b64af9f1-9022-4deb-8138-e644bc894d82\") " pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:06.899470 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:06.899385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7nh6n" Apr 24 22:35:07.020661 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:07.020628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7nh6n"] Apr 24 22:35:07.024417 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:35:07.024387 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb64af9f1_9022_4deb_8138_e644bc894d82.slice/crio-71091eddae672d49dd07cb27dec0010f8f3ce502ebd544a5a2f5c2f73641392d WatchSource:0}: Error finding container 71091eddae672d49dd07cb27dec0010f8f3ce502ebd544a5a2f5c2f73641392d: Status 404 returned error can't find the container with id 71091eddae672d49dd07cb27dec0010f8f3ce502ebd544a5a2f5c2f73641392d Apr 24 22:35:07.026163 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:07.026145 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:35:07.519567 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:07.519535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7nh6n" event={"ID":"b64af9f1-9022-4deb-8138-e644bc894d82","Type":"ContainerStarted","Data":"71091eddae672d49dd07cb27dec0010f8f3ce502ebd544a5a2f5c2f73641392d"} Apr 24 22:35:11.532177 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:11.532139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7nh6n" event={"ID":"b64af9f1-9022-4deb-8138-e644bc894d82","Type":"ContainerStarted","Data":"f89f1e11c317c4081eaa84c0e11fb65f022c99eeda4ca3c5a954a7682ef0379b"} Apr 24 22:35:11.550249 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:11.550200 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7nh6n" podStartSLOduration=1.818739533 podStartE2EDuration="5.550184092s" podCreationTimestamp="2026-04-24 22:35:06 +0000 UTC" firstStartedPulling="2026-04-24 22:35:07.026324767 +0000 UTC m=+312.947968021" lastFinishedPulling="2026-04-24 22:35:10.757769327 +0000 UTC m=+316.679412580" observedRunningTime="2026-04-24 22:35:11.549544465 +0000 UTC m=+317.471187727" watchObservedRunningTime="2026-04-24 22:35:11.550184092 +0000 UTC m=+317.471827353" Apr 24 22:35:26.583835 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.583800 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5"] Apr 24 22:35:26.587454 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.587431 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.590101 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.590074 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h7f5x\"" Apr 24 22:35:26.590364 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.590350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:35:26.591164 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.591149 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:35:26.593303 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.593282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6jq\" (UniqueName: \"kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.593404 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.593357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.593404 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.593394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.596762 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.596738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5"] Apr 24 22:35:26.693993 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.693953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.694166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.694003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.694166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.694059 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qb6jq\" (UniqueName: \"kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.694462 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.694441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.694498 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.694446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.703072 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.703047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6jq\" (UniqueName: \"kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:26.897379 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:26.897291 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:27.027844 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:27.027816 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5"] Apr 24 22:35:27.030817 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:35:27.030778 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f1eedd_673d_4061_9acc_eb454788dadf.slice/crio-04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a WatchSource:0}: Error finding container 04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a: Status 404 returned error can't find the container with id 04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a Apr 24 22:35:27.585713 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:27.585670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" event={"ID":"c0f1eedd-673d-4061-9acc-eb454788dadf","Type":"ContainerStarted","Data":"04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a"} Apr 24 22:35:33.607036 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:33.606999 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerID="a12c6e255195e3042be97172406a789c71fe0da267798851db40a89edd73b60f" exitCode=0 Apr 24 22:35:33.607513 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:33.607083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" event={"ID":"c0f1eedd-673d-4061-9acc-eb454788dadf","Type":"ContainerDied","Data":"a12c6e255195e3042be97172406a789c71fe0da267798851db40a89edd73b60f"} Apr 24 22:35:35.615241 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:35.615153 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerID="aebf895462938b45dddd659b72e4c2c838744810b70c5d77dd338553a8012a7f" exitCode=0 Apr 24 22:35:35.615241 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:35.615194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" event={"ID":"c0f1eedd-673d-4061-9acc-eb454788dadf","Type":"ContainerDied","Data":"aebf895462938b45dddd659b72e4c2c838744810b70c5d77dd338553a8012a7f"} Apr 24 22:35:42.638641 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:42.638574 2575 generic.go:358] "Generic (PLEG): container finished" podID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerID="9c4f7c105c8d218f670c79165c1c7a6d64ea2d6815ccba4a993bf92256b47839" exitCode=0 Apr 24 22:35:42.638994 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:42.638664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" event={"ID":"c0f1eedd-673d-4061-9acc-eb454788dadf","Type":"ContainerDied","Data":"9c4f7c105c8d218f670c79165c1c7a6d64ea2d6815ccba4a993bf92256b47839"} Apr 24 22:35:43.765067 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.765045 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:43.830635 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.830579 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util\") pod \"c0f1eedd-673d-4061-9acc-eb454788dadf\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " Apr 24 22:35:43.830822 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.830704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle\") pod \"c0f1eedd-673d-4061-9acc-eb454788dadf\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " Apr 24 22:35:43.830822 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.830764 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb6jq\" (UniqueName: \"kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq\") pod \"c0f1eedd-673d-4061-9acc-eb454788dadf\" (UID: \"c0f1eedd-673d-4061-9acc-eb454788dadf\") " Apr 24 22:35:43.831205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.831172 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle" (OuterVolumeSpecName: "bundle") pod "c0f1eedd-673d-4061-9acc-eb454788dadf" (UID: "c0f1eedd-673d-4061-9acc-eb454788dadf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:35:43.832903 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.832866 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq" (OuterVolumeSpecName: "kube-api-access-qb6jq") pod "c0f1eedd-673d-4061-9acc-eb454788dadf" (UID: "c0f1eedd-673d-4061-9acc-eb454788dadf"). InnerVolumeSpecName "kube-api-access-qb6jq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:35:43.834393 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.834374 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util" (OuterVolumeSpecName: "util") pod "c0f1eedd-673d-4061-9acc-eb454788dadf" (UID: "c0f1eedd-673d-4061-9acc-eb454788dadf"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:35:43.932061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.931982 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-util\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:35:43.932061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.932014 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0f1eedd-673d-4061-9acc-eb454788dadf-bundle\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:35:43.932061 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:43.932024 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qb6jq\" (UniqueName: \"kubernetes.io/projected/c0f1eedd-673d-4061-9acc-eb454788dadf-kube-api-access-qb6jq\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:35:44.646275 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:44.646247 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" Apr 24 22:35:44.646436 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:44.646244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cdrxj5" event={"ID":"c0f1eedd-673d-4061-9acc-eb454788dadf","Type":"ContainerDied","Data":"04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a"} Apr 24 22:35:44.646436 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:44.646361 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04a5d90baf084b89463cf73d5c4b23d7ecc34bfb733a3673900bb9a27b09c71a" Apr 24 22:35:48.112166 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112131 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"] Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112443 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="util" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112452 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="util" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112462 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="extract" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112467 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="extract" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112483 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="pull" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112489 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="pull" Apr 24 22:35:48.112546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.112535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0f1eedd-673d-4061-9acc-eb454788dadf" containerName="extract" Apr 
Apr 24 22:35:48.117324 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.117307 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.119691 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.119662 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 22:35:48.119819 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.119734 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 22:35:48.119819 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.119742 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-w2zr9\""
Apr 24 22:35:48.119819 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.119743 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 22:35:48.125616 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.125575 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"]
Apr 24 22:35:48.167419 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.167393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e42490de-473d-415f-a94b-484b75f2a249-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.167638 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.167426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84dtv\" (UniqueName: \"kubernetes.io/projected/e42490de-473d-415f-a94b-484b75f2a249-kube-api-access-84dtv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.268824 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.268783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e42490de-473d-415f-a94b-484b75f2a249-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.269020 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.268829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84dtv\" (UniqueName: \"kubernetes.io/projected/e42490de-473d-415f-a94b-484b75f2a249-kube-api-access-84dtv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.271209 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.271185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e42490de-473d-415f-a94b-484b75f2a249-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.280509 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.280483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84dtv\" (UniqueName: \"kubernetes.io/projected/e42490de-473d-415f-a94b-484b75f2a249-kube-api-access-84dtv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h\" (UID: \"e42490de-473d-415f-a94b-484b75f2a249\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.428101 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.428010 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:48.551357 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.551329 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"]
Apr 24 22:35:48.554289 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:35:48.554259 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42490de_473d_415f_a94b_484b75f2a249.slice/crio-f58bdbf48d9d1ddeac35c0131ef66555e00a4a8268795594c2b57cbead25cf05 WatchSource:0}: Error finding container f58bdbf48d9d1ddeac35c0131ef66555e00a4a8268795594c2b57cbead25cf05: Status 404 returned error can't find the container with id f58bdbf48d9d1ddeac35c0131ef66555e00a4a8268795594c2b57cbead25cf05
Apr 24 22:35:48.658667 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:48.658628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h" event={"ID":"e42490de-473d-415f-a94b-484b75f2a249","Type":"ContainerStarted","Data":"f58bdbf48d9d1ddeac35c0131ef66555e00a4a8268795594c2b57cbead25cf05"}
Apr 24 22:35:51.670268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:51.670226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h" event={"ID":"e42490de-473d-415f-a94b-484b75f2a249","Type":"ContainerStarted","Data":"a51d063182a61858cea16dbe9ebec6208b4c663108ee5d7e4dce8d092c2eebee"}
Apr 24 22:35:51.670746 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:51.670412 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h"
Apr 24 22:35:51.691751 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:51.691696 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h" podStartSLOduration=0.677565787 podStartE2EDuration="3.691681935s" podCreationTimestamp="2026-04-24 22:35:48 +0000 UTC" firstStartedPulling="2026-04-24 22:35:48.555917097 +0000 UTC m=+354.477560335" lastFinishedPulling="2026-04-24 22:35:51.570033241 +0000 UTC m=+357.491676483" observedRunningTime="2026-04-24 22:35:51.689475154 +0000 UTC m=+357.611118415" watchObservedRunningTime="2026-04-24 22:35:51.691681935 +0000 UTC m=+357.613325195"
Apr 24 22:35:52.141955 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.141923 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2sbf7"]
Apr 24 22:35:52.145243 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.145227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.148039 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.147570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 22:35:52.148039 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.147877 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 22:35:52.148039 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.147925 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8knqq\""
Apr 24 22:35:52.154426 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.154214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2sbf7"]
Apr 24 22:35:52.199570 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.199540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.199745 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.199601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b2kw\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-kube-api-access-2b2kw\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.199745 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.199722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d7751a30-f0a8-4a93-86ba-9397fdfd540c-cabundle0\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.301054 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.301019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d7751a30-f0a8-4a93-86ba-9397fdfd540c-cabundle0\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.301241 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.301078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.301241 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.301134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b2kw\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-kube-api-access-2b2kw\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.301241 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.301171 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:35:52.301241 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.301188 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:35:52.301241 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.301198 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2sbf7: references non-existent secret key: ca.crt
Apr 24 22:35:52.301442 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.301265 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates podName:d7751a30-f0a8-4a93-86ba-9397fdfd540c nodeName:}" failed. No retries permitted until 2026-04-24 22:35:52.80125045 +0000 UTC m=+358.722893688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates") pod "keda-operator-ffbb595cb-2sbf7" (UID: "d7751a30-f0a8-4a93-86ba-9397fdfd540c") : references non-existent secret key: ca.crt
Apr 24 22:35:52.301717 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.301700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d7751a30-f0a8-4a93-86ba-9397fdfd540c-cabundle0\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.313696 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.313665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b2kw\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-kube-api-access-2b2kw\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.768127 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.768093 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-zb4g9"]
Apr 24 22:35:52.772052 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.772032 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:52.774405 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.774388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 22:35:52.781377 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.781354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zb4g9"]
Apr 24 22:35:52.807508 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.807472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:52.807723 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.807605 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:35:52.807723 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.807624 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:35:52.807723 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.807636 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2sbf7: references non-existent secret key: ca.crt
Apr 24 22:35:52.807723 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:52.807693 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates podName:d7751a30-f0a8-4a93-86ba-9397fdfd540c nodeName:}" failed. No retries permitted until 2026-04-24 22:35:53.807674664 +0000 UTC m=+359.729317905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates") pod "keda-operator-ffbb595cb-2sbf7" (UID: "d7751a30-f0a8-4a93-86ba-9397fdfd540c") : references non-existent secret key: ca.crt
Apr 24 22:35:52.908212 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.908175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwbv\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-kube-api-access-ljwbv\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:52.908212 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:52.908224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.009309 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.009264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwbv\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-kube-api-access-ljwbv\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.009513 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.009326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.009513 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.009485 2575 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 24 22:35:53.009652 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.009516 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-zb4g9: secret "keda-admission-webhooks-certs" not found
Apr 24 22:35:53.009652 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.009584 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates podName:6ec9aec9-5157-4998-b38f-d9bc7b188d68 nodeName:}" failed. No retries permitted until 2026-04-24 22:35:53.50956581 +0000 UTC m=+359.431209064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates") pod "keda-admission-cf49989db-zb4g9" (UID: "6ec9aec9-5157-4998-b38f-d9bc7b188d68") : secret "keda-admission-webhooks-certs" not found
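Both keda pods are failing volume setup for the same underlying reason: a projected volume names secret material that does not exist yet. The kedaorg-certs secret exists but lacks a ca.crt key, while keda-admission-webhooks-certs is missing entirely; this is the usual operator-bootstrap race, where the component that populates the certificates starts after the deployments that reference them. A hedged sketch of the two failing lookups (the secret contents shown are assumptions for illustration; the real checks live in the kubelet's secret.go and projected.go):

```go
package main

import "fmt"

// Toy model of the two projected-volume failures above: a missing secret
// versus an existing secret that lacks the requested key.
func lookup(secrets map[string]map[string][]byte, name, key string) error {
	data, ok := secrets[name]
	if !ok {
		return fmt.Errorf("secret %q not found", name)
	}
	if _, ok := data[key]; !ok {
		return fmt.Errorf("references non-existent secret key: %s", key)
	}
	return nil
}

func main() {
	secrets := map[string]map[string][]byte{
		// Assumed contents: kedaorg-certs exists but has no ca.crt yet.
		"kedaorg-certs": {"tls.crt": nil, "tls.key": nil},
		// keda-admission-webhooks-certs has not been created at all yet.
	}
	fmt.Println(lookup(secrets, "kedaorg-certs", "ca.crt"))
	fmt.Println(lookup(secrets, "keda-admission-webhooks-certs", "tls.crt"))
}
```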
Apr 24 22:35:53.026350 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.026276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwbv\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-kube-api-access-ljwbv\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.514875 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.514834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.517342 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.517315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ec9aec9-5157-4998-b38f-d9bc7b188d68-certificates\") pod \"keda-admission-cf49989db-zb4g9\" (UID: \"6ec9aec9-5157-4998-b38f-d9bc7b188d68\") " pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.684408 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.684375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:53.818361 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.818264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:53.818829 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.818441 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:35:53.818829 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.818472 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:35:53.818829 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.818486 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2sbf7: references non-existent secret key: ca.crt
Apr 24 22:35:53.818829 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:53.818567 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates podName:d7751a30-f0a8-4a93-86ba-9397fdfd540c nodeName:}" failed. No retries permitted until 2026-04-24 22:35:55.818546934 +0000 UTC m=+361.740190176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates") pod "keda-operator-ffbb595cb-2sbf7" (UID: "d7751a30-f0a8-4a93-86ba-9397fdfd540c") : references non-existent secret key: ca.crt
Apr 24 22:35:53.825807 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:53.825775 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zb4g9"]
Apr 24 22:35:53.829700 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:35:53.829667 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec9aec9_5157_4998_b38f_d9bc7b188d68.slice/crio-0d582931b713cec9500696e65a1a868398d8bbda4fdf98004c4d1444467bbbfa WatchSource:0}: Error finding container 0d582931b713cec9500696e65a1a868398d8bbda4fdf98004c4d1444467bbbfa: Status 404 returned error can't find the container with id 0d582931b713cec9500696e65a1a868398d8bbda4fdf98004c4d1444467bbbfa
Apr 24 22:35:54.681639 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:54.681583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zb4g9" event={"ID":"6ec9aec9-5157-4998-b38f-d9bc7b188d68","Type":"ContainerStarted","Data":"0d582931b713cec9500696e65a1a868398d8bbda4fdf98004c4d1444467bbbfa"}
Apr 24 22:35:55.688857 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:55.688819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zb4g9" event={"ID":"6ec9aec9-5157-4998-b38f-d9bc7b188d68","Type":"ContainerStarted","Data":"eeabda3099a226aadcfa6202b64332053309e647f1e30b0babffbf4af497909c"}
Apr 24 22:35:55.689299 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:55.688987 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-zb4g9"
Apr 24 22:35:55.712987 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:55.712928 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-zb4g9" podStartSLOduration=2.5142877 podStartE2EDuration="3.712911918s" podCreationTimestamp="2026-04-24 22:35:52 +0000 UTC" firstStartedPulling="2026-04-24 22:35:53.831137701 +0000 UTC m=+359.752780956" lastFinishedPulling="2026-04-24 22:35:55.029761935 +0000 UTC m=+360.951405174" observedRunningTime="2026-04-24 22:35:55.709955434 +0000 UTC m=+361.631598694" watchObservedRunningTime="2026-04-24 22:35:55.712911918 +0000 UTC m=+361.634555178"
Apr 24 22:35:55.836535 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:55.836482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:55.836745 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:55.836652 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:35:55.836745 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:55.836669 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:35:55.836745 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:55.836678 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2sbf7: references non-existent secret key: ca.crt
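The "Observed pod startup duration" entries report two figures, and the keda-admission entry above lets us check how they relate: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same span minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling), which is how the kubelet's pod_startup_latency_tracker excludes pull time from the SLO figure. A short sketch reproducing the numbers (timestamps copied from the entry above; the interpretation is inferred from the fact that the arithmetic matches exactly):

```go
package main

import (
	"fmt"
	"time"
)

// Reproduce the keda-admission-cf49989db-zb4g9 startup-duration entry.
func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse(time.RFC3339Nano, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-04-24T22:35:52Z")
	running := parse("2026-04-24T22:35:55.709955434Z")
	firstPull := parse("2026-04-24T22:35:53.831137701Z")
	lastPull := parse("2026-04-24T22:35:55.029761935Z")

	e2e := running.Sub(created)          // 3.712911918s = podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.514287684s ~ podStartSLOduration=2.5142877
	fmt.Println(e2e, slo)
}
```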
non-existent secret key: ca.crt
Apr 24 22:35:55.836745 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:35:55.836739 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates podName:d7751a30-f0a8-4a93-86ba-9397fdfd540c nodeName:}" failed. No retries permitted until 2026-04-24 22:35:59.836723416 +0000 UTC m=+365.758366676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates") pod "keda-operator-ffbb595cb-2sbf7" (UID: "d7751a30-f0a8-4a93-86ba-9397fdfd540c") : references non-existent secret key: ca.crt
Apr 24 22:35:59.869038 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:59.868994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:59.871495 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:59.871471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d7751a30-f0a8-4a93-86ba-9397fdfd540c-certificates\") pod \"keda-operator-ffbb595cb-2sbf7\" (UID: \"d7751a30-f0a8-4a93-86ba-9397fdfd540c\") " pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:35:59.961604 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:35:59.961553 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:36:00.079532 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:00.079507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2sbf7"]
Apr 24 22:36:00.082368 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:36:00.082341 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7751a30_f0a8_4a93_86ba_9397fdfd540c.slice/crio-2b1bf8be0e1a2870c73e85b86f9451b01d5d082c2bb3a68dc13127fd695c07d0 WatchSource:0}: Error finding container 2b1bf8be0e1a2870c73e85b86f9451b01d5d082c2bb3a68dc13127fd695c07d0: Status 404 returned error can't find the container with id 2b1bf8be0e1a2870c73e85b86f9451b01d5d082c2bb3a68dc13127fd695c07d0
Apr 24 22:36:00.704638 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:00.704584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7" event={"ID":"d7751a30-f0a8-4a93-86ba-9397fdfd540c","Type":"ContainerStarted","Data":"2b1bf8be0e1a2870c73e85b86f9451b01d5d082c2bb3a68dc13127fd695c07d0"}
Apr 24 22:36:03.715917 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:03.715880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7" event={"ID":"d7751a30-f0a8-4a93-86ba-9397fdfd540c","Type":"ContainerStarted","Data":"8fcbb0ac86ad2bfed6ac011512fd85c1d8f6e542137bc3b14c8171d63d1eaaca"}
Apr 24 22:36:03.716358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:03.715941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7"
Apr 24 22:36:03.731989 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:03.731941 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7" podStartSLOduration=8.75334753 podStartE2EDuration="11.731929509s" podCreationTimestamp="2026-04-24 22:35:52 +0000 UTC" firstStartedPulling="2026-04-24 22:36:00.083955182 +0000 UTC m=+366.005598423" lastFinishedPulling="2026-04-24 22:36:03.062537162 +0000 UTC m=+368.984180402" observedRunningTime="2026-04-24 22:36:03.731898721 +0000 UTC m=+369.653541993" watchObservedRunningTime="2026-04-24 22:36:03.731929509 +0000 UTC m=+369.653572770"
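Note the retry cadence on the failing certificates mount: durationBeforeRetry goes 500ms, 1s, 2s, 4s across the nestedpendingoperations entries, so the kubelet backs off exponentially per volume operation rather than hot-looping, and the mount finally succeeds at 22:35:59 once the ca.crt key appears in the secret. A minimal sketch of that doubling schedule (the cap is an assumption for illustration; this log only shows the first four delays):

```go
package main

import (
	"fmt"
	"time"
)

// Doubling retry schedule as observed above: 500ms, 1s, 2s, 4s, ...
// maxDelay is an assumed illustrative cap, not taken from this log.
func main() {
	delay := 500 * time.Millisecond
	maxDelay := 2 * time.Minute
	for i := 0; i < 8; i++ {
		fmt.Println(delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```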
pod="openshift-keda/keda-operator-ffbb595cb-2sbf7" podStartSLOduration=8.75334753 podStartE2EDuration="11.731929509s" podCreationTimestamp="2026-04-24 22:35:52 +0000 UTC" firstStartedPulling="2026-04-24 22:36:00.083955182 +0000 UTC m=+366.005598423" lastFinishedPulling="2026-04-24 22:36:03.062537162 +0000 UTC m=+368.984180402" observedRunningTime="2026-04-24 22:36:03.731898721 +0000 UTC m=+369.653541993" watchObservedRunningTime="2026-04-24 22:36:03.731929509 +0000 UTC m=+369.653572770" Apr 24 22:36:12.675899 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:12.675868 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnl6h" Apr 24 22:36:16.694745 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:16.694712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-zb4g9" Apr 24 22:36:24.722152 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:36:24.722124 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2sbf7" Apr 24 22:37:02.481242 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.481200 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"] Apr 24 22:37:02.484804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.484784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.487442 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.487410 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 22:37:02.487561 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.487499 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 22:37:02.488529 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.488511 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 22:37:02.488650 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.488543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9mbfm\"" Apr 24 22:37:02.495546 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.495523 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"] Apr 24 22:37:02.525075 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.525039 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dr49k"] Apr 24 22:37:02.528078 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.528059 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.530808 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.530579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 22:37:02.531105 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.531086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zj965\"" Apr 24 22:37:02.537418 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.537397 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dr49k"] Apr 24 22:37:02.589750 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.589713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzfd\" (UniqueName: \"kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.589931 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.589775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.690305 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.690264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.690478 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.690343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-data\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.690478 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.690405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzfd\" (UniqueName: \"kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.690478 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.690436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjf8f\" (UniqueName: \"kubernetes.io/projected/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-kube-api-access-gjf8f\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.692805 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.692778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: 
\"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.699359 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.699335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzfd\" (UniqueName: \"kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd\") pod \"kserve-controller-manager-549bc44c6d-sh29b\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") " pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.791650 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.791513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-data\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.791650 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.791619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjf8f\" (UniqueName: \"kubernetes.io/projected/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-kube-api-access-gjf8f\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.791984 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.791958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-data\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.797541 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.797516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:02.800803 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.800782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjf8f\" (UniqueName: \"kubernetes.io/projected/d20746c0-08ea-4a8c-8c3b-d47523a2cb9b-kube-api-access-gjf8f\") pod \"seaweedfs-86cc847c5c-dr49k\" (UID: \"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b\") " pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.839013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.838977 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:02.931721 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.931668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"] Apr 24 22:37:02.934029 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:37:02.934002 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc30c24_fa87_4fd3_8e8a_04860342d2b6.slice/crio-6753271e99e73726726918c029aa2d93082884e72200a498396fdaf6619c015e WatchSource:0}: Error finding container 6753271e99e73726726918c029aa2d93082884e72200a498396fdaf6619c015e: Status 404 returned error can't find the container with id 6753271e99e73726726918c029aa2d93082884e72200a498396fdaf6619c015e Apr 24 22:37:02.978839 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:02.978819 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dr49k"] Apr 24 22:37:02.981360 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:37:02.981335 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd20746c0_08ea_4a8c_8c3b_d47523a2cb9b.slice/crio-b347683793c87028517ee346049dcaf7a594f31dc565e6ad859f5601583d8bf1 WatchSource:0}: Error finding container b347683793c87028517ee346049dcaf7a594f31dc565e6ad859f5601583d8bf1: Status 404 returned error can't find the container with id b347683793c87028517ee346049dcaf7a594f31dc565e6ad859f5601583d8bf1 Apr 24 22:37:03.922296 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:03.922244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" event={"ID":"0bc30c24-fa87-4fd3-8e8a-04860342d2b6","Type":"ContainerStarted","Data":"6753271e99e73726726918c029aa2d93082884e72200a498396fdaf6619c015e"} Apr 24 22:37:03.923663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:03.923625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dr49k" event={"ID":"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b","Type":"ContainerStarted","Data":"b347683793c87028517ee346049dcaf7a594f31dc565e6ad859f5601583d8bf1"} Apr 24 22:37:06.936334 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.936241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dr49k" event={"ID":"d20746c0-08ea-4a8c-8c3b-d47523a2cb9b","Type":"ContainerStarted","Data":"061d65ee73c00a56740eeb18327da88ad6aa00ae4eef756ca972054903a7cc7c"} Apr 24 22:37:06.936334 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.936310 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dr49k" Apr 24 22:37:06.937625 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.937583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" event={"ID":"0bc30c24-fa87-4fd3-8e8a-04860342d2b6","Type":"ContainerStarted","Data":"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"} Apr 24 22:37:06.937724 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.937707 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" Apr 24 22:37:06.955171 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.955121 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dr49k" podStartSLOduration=1.4281732630000001 
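The recurring "Failed to process watch event ... Status 404" warnings all follow the same pattern: the cgroup watcher sees a new crio-<id> directory appear and tries to look the container up by ID before the runtime has finished registering it, so the lookup returns 404; moments later the same ID arrives in an ordinary ContainerStarted PLEG event. This looks like a benign check-then-act race between two observers of the same state, commonly seen in kubelet logs around container creation. A toy illustration of such a race (all names here are mine, not kubelet code):

```go
package main

import (
	"errors"
	"fmt"
)

// A watcher learns about a new ID before the registry that serves
// lookups has recorded it, so the first lookup fails by design.
type registry struct{ known map[string]bool }

func (r *registry) find(id string) error {
	if !r.known[id] {
		return errors.New("can't find the container with id " + id)
	}
	return nil
}

func main() {
	r := &registry{known: map[string]bool{}}
	id := "6753271e99e7..."                  // cgroup shows up first...
	fmt.Println("watch event:", r.find(id)) // ...so the lookup 404s
	r.known[id] = true                      // runtime finishes registering
	fmt.Println("later lookup:", r.find(id)) // now it resolves
}
```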
Apr 24 22:37:06.971034 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:06.970986 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" podStartSLOduration=1.492124061 podStartE2EDuration="4.970971503s" podCreationTimestamp="2026-04-24 22:37:02 +0000 UTC" firstStartedPulling="2026-04-24 22:37:02.935261962 +0000 UTC m=+428.856905204" lastFinishedPulling="2026-04-24 22:37:06.414109396 +0000 UTC m=+432.335752646" observedRunningTime="2026-04-24 22:37:06.970109376 +0000 UTC m=+432.891752636" watchObservedRunningTime="2026-04-24 22:37:06.970971503 +0000 UTC m=+432.892614763"
Apr 24 22:37:12.945655 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:12.945559 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dr49k"
Apr 24 22:37:37.949063 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:37.949034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b"
Apr 24 22:37:38.222421 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.222343 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"]
Apr 24 22:37:38.222568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.222540 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" podUID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" containerName="manager" containerID="cri-o://e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa" gracePeriod=10
Apr 24 22:37:38.248892 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.248864 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-tcgps"]
Apr 24 22:37:38.252258 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.252237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.260439 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.260414 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-tcgps"]
Apr 24 22:37:38.288528 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.288500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnllb\" (UniqueName: \"kubernetes.io/projected/10346e36-5134-4c43-9cdc-868be1c90c55-kube-api-access-nnllb\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.288707 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.288581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10346e36-5134-4c43-9cdc-868be1c90c55-cert\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.389861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.389815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10346e36-5134-4c43-9cdc-868be1c90c55-cert\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.390044 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.389902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnllb\" (UniqueName: \"kubernetes.io/projected/10346e36-5134-4c43-9cdc-868be1c90c55-kube-api-access-nnllb\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.393365 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.392968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10346e36-5134-4c43-9cdc-868be1c90c55-cert\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.399070 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.399041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnllb\" (UniqueName: \"kubernetes.io/projected/10346e36-5134-4c43-9cdc-868be1c90c55-kube-api-access-nnllb\") pod \"kserve-controller-manager-549bc44c6d-tcgps\" (UID: \"10346e36-5134-4c43-9cdc-868be1c90c55\") " pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.477672 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.477613 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b"
Apr 24 22:37:38.591203 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.591169 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert\") pod \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") "
Apr 24 22:37:38.591367 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.591216 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzfd\" (UniqueName: \"kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd\") pod \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\" (UID: \"0bc30c24-fa87-4fd3-8e8a-04860342d2b6\") "
Apr 24 22:37:38.593431 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.593407 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert" (OuterVolumeSpecName: "cert") pod "0bc30c24-fa87-4fd3-8e8a-04860342d2b6" (UID: "0bc30c24-fa87-4fd3-8e8a-04860342d2b6"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:37:38.593539 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.593469 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd" (OuterVolumeSpecName: "kube-api-access-mdzfd") pod "0bc30c24-fa87-4fd3-8e8a-04860342d2b6" (UID: "0bc30c24-fa87-4fd3-8e8a-04860342d2b6"). InnerVolumeSpecName "kube-api-access-mdzfd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:38.599618 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.599601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
Apr 24 22:37:38.692718 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.692662 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-cert\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:37:38.692718 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.692703 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdzfd\" (UniqueName: \"kubernetes.io/projected/0bc30c24-fa87-4fd3-8e8a-04860342d2b6-kube-api-access-mdzfd\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:37:38.719361 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:38.719334 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-tcgps"]
Apr 24 22:37:38.721460 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:37:38.721429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10346e36_5134_4c43_9cdc_868be1c90c55.slice/crio-37bca507e5e8e5be2834ddbb6ce2b91f3d201c275e2d45947d6d182317533193 WatchSource:0}: Error finding container 37bca507e5e8e5be2834ddbb6ce2b91f3d201c275e2d45947d6d182317533193: Status 404 returned error can't find the container with id 37bca507e5e8e5be2834ddbb6ce2b91f3d201c275e2d45947d6d182317533193
Apr 24 22:37:39.046435 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.046403 2575 generic.go:358] "Generic (PLEG): container finished" podID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" containerID="e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa" exitCode=0
Apr 24 22:37:39.046879 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.046460 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b"
Apr 24 22:37:39.046879 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.046481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" event={"ID":"0bc30c24-fa87-4fd3-8e8a-04860342d2b6","Type":"ContainerDied","Data":"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"}
Apr 24 22:37:39.046879 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.046511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-sh29b" event={"ID":"0bc30c24-fa87-4fd3-8e8a-04860342d2b6","Type":"ContainerDied","Data":"6753271e99e73726726918c029aa2d93082884e72200a498396fdaf6619c015e"}
Apr 24 22:37:39.046879 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.046526 2575 scope.go:117] "RemoveContainer" containerID="e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"
Apr 24 22:37:39.047736 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.047716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps" event={"ID":"10346e36-5134-4c43-9cdc-868be1c90c55","Type":"ContainerStarted","Data":"37bca507e5e8e5be2834ddbb6ce2b91f3d201c275e2d45947d6d182317533193"}
Apr 24 22:37:39.055151 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.055136 2575 scope.go:117] "RemoveContainer" containerID="e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"
Apr 24 22:37:39.055431 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:37:39.055412 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa\": container with ID starting with e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa not found: ID does not exist" containerID="e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"
Apr 24 22:37:39.055477 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.055440 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa"} err="failed to get container status \"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa\": rpc error: code = NotFound desc = could not find container \"e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa\": container with ID starting with e6cf75b9e97abbeffceb4af42adfe96cbb36af779378909bf57f617d310742fa not found: ID does not exist"
Apr 24 22:37:39.067407 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.067382 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"]
Apr 24 22:37:39.070163 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:39.070143 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-sh29b"]
Apr 24 22:37:40.052468 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:40.052433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps" event={"ID":"10346e36-5134-4c43-9cdc-868be1c90c55","Type":"ContainerStarted","Data":"09fa9b3324ddf08765de8e19fac2685b4ec8315d083fe78074a3e6ab5b2cce8d"}
Apr 24 22:37:40.052916 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:40.052477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps"
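The handover from the sh29b pod to tcgps above is a textbook rolling-update teardown: the API DELETE triggers "Killing container with a grace period" (gracePeriod=10), the PLEG reports ContainerDied with exitCode=0, volumes are unmounted and detached, and the pod object is finally REMOVEd. The "ContainerStatus from runtime service failed ... NotFound" error right after the second RemoveContainer is the cleanup racing itself: the container is already gone by the time the status re-check runs, and for a delete, NotFound is effectively success because the desired end state (container absent) has been reached. A hedged sketch of that idempotent-delete convention (illustrative only, not kubelet code):

```go
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound")

// Deleting something that is already gone leaves the system in the
// desired state, so callers can treat NotFound as success.
func remove(store map[string]bool, id string) error {
	if !store[id] {
		return errNotFound
	}
	delete(store, id)
	return nil
}

func main() {
	store := map[string]bool{"e6cf75b9e97a...": true} // truncated ID, for illustration
	fmt.Println(remove(store, "e6cf75b9e97a..."))     // <nil>: removed
	err := remove(store, "e6cf75b9e97a...")           // second attempt: already gone
	if errors.Is(err, errNotFound) {
		fmt.Println("already deleted; treating as success")
	}
}
```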
24 22:37:40.071489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:40.071440 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps" podStartSLOduration=1.69183395 podStartE2EDuration="2.071425474s" podCreationTimestamp="2026-04-24 22:37:38 +0000 UTC" firstStartedPulling="2026-04-24 22:37:38.722796156 +0000 UTC m=+464.644439395" lastFinishedPulling="2026-04-24 22:37:39.102387678 +0000 UTC m=+465.024030919" observedRunningTime="2026-04-24 22:37:40.070908135 +0000 UTC m=+465.992551396" watchObservedRunningTime="2026-04-24 22:37:40.071425474 +0000 UTC m=+465.993068734" Apr 24 22:37:40.566600 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:37:40.566551 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" path="/var/lib/kubelet/pods/0bc30c24-fa87-4fd3-8e8a-04860342d2b6/volumes" Apr 24 22:38:11.062316 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.062282 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-tcgps" Apr 24 22:38:11.889811 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.889781 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-fv66f"] Apr 24 22:38:11.890140 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.890127 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" containerName="manager" Apr 24 22:38:11.890184 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.890140 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" containerName="manager" Apr 24 22:38:11.890218 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.890192 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bc30c24-fa87-4fd3-8e8a-04860342d2b6" containerName="manager" Apr 24 22:38:11.893205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.893181 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:11.896288 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.896253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 22:38:11.896288 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.896264 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-pk67t\"" Apr 24 22:38:11.907465 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:11.907443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-fv66f"] Apr 24 22:38:12.087425 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.087391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2cfe8849-ad32-4aff-a904-f77a727d6b37-tls-certs\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.087808 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.087446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfcww\" (UniqueName: \"kubernetes.io/projected/2cfe8849-ad32-4aff-a904-f77a727d6b37-kube-api-access-kfcww\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.188919 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.188879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2cfe8849-ad32-4aff-a904-f77a727d6b37-tls-certs\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.189093 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.188959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfcww\" (UniqueName: \"kubernetes.io/projected/2cfe8849-ad32-4aff-a904-f77a727d6b37-kube-api-access-kfcww\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.191493 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.191475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2cfe8849-ad32-4aff-a904-f77a727d6b37-tls-certs\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.196626 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.196566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfcww\" (UniqueName: \"kubernetes.io/projected/2cfe8849-ad32-4aff-a904-f77a727d6b37-kube-api-access-kfcww\") pod \"model-serving-api-86f7b4b499-fv66f\" (UID: \"2cfe8849-ad32-4aff-a904-f77a727d6b37\") " pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.204485 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.204463 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:12.328926 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:12.328900 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-fv66f"] Apr 24 22:38:12.331117 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:38:12.331089 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cfe8849_ad32_4aff_a904_f77a727d6b37.slice/crio-7a93dbc709df498fbde07b90035141a9add36a8f71b1c27f7ccabb1124e08f92 WatchSource:0}: Error finding container 7a93dbc709df498fbde07b90035141a9add36a8f71b1c27f7ccabb1124e08f92: Status 404 returned error can't find the container with id 7a93dbc709df498fbde07b90035141a9add36a8f71b1c27f7ccabb1124e08f92 Apr 24 22:38:13.163637 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:13.163578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-fv66f" event={"ID":"2cfe8849-ad32-4aff-a904-f77a727d6b37","Type":"ContainerStarted","Data":"7a93dbc709df498fbde07b90035141a9add36a8f71b1c27f7ccabb1124e08f92"} Apr 24 22:38:14.168276 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:14.168239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-fv66f" event={"ID":"2cfe8849-ad32-4aff-a904-f77a727d6b37","Type":"ContainerStarted","Data":"d68bf9bf26ade960817ffe314eacf3ac80d92d0c3aa8be68590b1ff77cf02052"} Apr 24 22:38:14.168715 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:14.168363 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:14.187342 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:14.187294 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-fv66f" podStartSLOduration=1.899906562 podStartE2EDuration="3.187278052s" podCreationTimestamp="2026-04-24 22:38:11 +0000 UTC" firstStartedPulling="2026-04-24 22:38:12.332777364 +0000 UTC m=+498.254420605" lastFinishedPulling="2026-04-24 22:38:13.620148853 +0000 UTC m=+499.541792095" observedRunningTime="2026-04-24 22:38:14.18504539 +0000 UTC m=+500.106688674" watchObservedRunningTime="2026-04-24 22:38:14.187278052 +0000 UTC m=+500.108921312" Apr 24 22:38:25.176734 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:25.176705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-fv66f" Apr 24 22:38:47.271549 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.271457 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"] Apr 24 22:38:47.274924 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.274905 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.279508 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.279485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1b9dc-predictor-serving-cert\"" Apr 24 22:38:47.279674 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.279650 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-58j9x\"" Apr 24 22:38:47.279801 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.279498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\"" Apr 24 22:38:47.280035 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.280019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:38:47.280679 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.280663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:38:47.294391 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.294357 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"] Apr 24 22:38:47.384702 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.384654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz58w\" (UniqueName: \"kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.384865 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.384778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.384865 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.384851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.486323 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.486279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.486530 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.486374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz58w\" (UniqueName: 
\"kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.486530 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.486433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.487146 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.487125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.488926 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.488893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.494490 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.494461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz58w\" (UniqueName: \"kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w\") pod \"error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.587517 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.587431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:38:47.724842 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:47.724808 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"] Apr 24 22:38:47.727452 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:38:47.727401 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27eabfc_86ce_49e3_b17b_33d45d7c4352.slice/crio-7f972f22b8c5fa00b7f6fc0542efd6e1d757cdeb12c05d8fa04add12e7280ce4 WatchSource:0}: Error finding container 7f972f22b8c5fa00b7f6fc0542efd6e1d757cdeb12c05d8fa04add12e7280ce4: Status 404 returned error can't find the container with id 7f972f22b8c5fa00b7f6fc0542efd6e1d757cdeb12c05d8fa04add12e7280ce4 Apr 24 22:38:48.045820 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.045787 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"] Apr 24 22:38:48.050719 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.050685 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.053224 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.053203 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 24 22:38:48.053339 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.053203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 24 22:38:48.060025 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.059999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"] Apr 24 22:38:48.192817 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.192778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.193841 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.193799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsm7\" (UniqueName: \"kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.194084 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.194063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.194168 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.194134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.290504 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.290441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerStarted","Data":"7f972f22b8c5fa00b7f6fc0542efd6e1d757cdeb12c05d8fa04add12e7280ce4"} Apr 24 22:38:48.296969 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.295634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.296969 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.295685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.296969 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.295727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.296969 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.295770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsm7\" (UniqueName: \"kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.296969 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.296879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.298128 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.298053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.302149 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.301634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.305012 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.304962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsm7\" (UniqueName: \"kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7\") pod \"isvc-sklearn-graph-2-predictor-7db9dc6879-jr686\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.362261 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.362214 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:38:48.550567 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:48.550486 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"] Apr 24 22:38:48.557479 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:38:48.557445 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff3301f_15eb_4fca_a9cf_0ef15031bd39.slice/crio-e4141f9cdbbd1158bdd4990d28682bf66b9cad4047d33ab5ff1db960b2deb173 WatchSource:0}: Error finding container e4141f9cdbbd1158bdd4990d28682bf66b9cad4047d33ab5ff1db960b2deb173: Status 404 returned error can't find the container with id e4141f9cdbbd1158bdd4990d28682bf66b9cad4047d33ab5ff1db960b2deb173 Apr 24 22:38:49.297498 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:38:49.297456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerStarted","Data":"e4141f9cdbbd1158bdd4990d28682bf66b9cad4047d33ab5ff1db960b2deb173"} Apr 24 22:39:01.347292 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:01.347248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerStarted","Data":"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"} Apr 24 22:39:01.350048 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:01.349988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerStarted","Data":"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37"} Apr 24 22:39:03.358122 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.358089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerStarted","Data":"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb"} Apr 24 22:39:03.358646 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.358320 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:39:03.358646 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.358345 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:39:03.359796 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.359748 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:39:03.375692 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.375643 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podStartSLOduration=0.949874396 podStartE2EDuration="16.375628134s" podCreationTimestamp="2026-04-24 22:38:47 +0000 UTC" firstStartedPulling="2026-04-24 
Apr 24 22:39:03.375692 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:03.375643 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podStartSLOduration=0.949874396 podStartE2EDuration="16.375628134s" podCreationTimestamp="2026-04-24 22:38:47 +0000 UTC" firstStartedPulling="2026-04-24 22:38:47.729683385 +0000 UTC m=+533.651326627" lastFinishedPulling="2026-04-24 22:39:03.155437123 +0000 UTC m=+549.077080365" observedRunningTime="2026-04-24 22:39:03.374812398 +0000 UTC m=+549.296455659" watchObservedRunningTime="2026-04-24 22:39:03.375628134 +0000 UTC m=+549.297271392"
Apr 24 22:39:04.361827 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:04.361781 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 22:39:05.365361 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:05.365328 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerID="3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a" exitCode=0
Apr 24 22:39:05.365746 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:05.365399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerDied","Data":"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"}
Apr 24 22:39:09.367232 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:09.367200 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"
Apr 24 22:39:09.367836 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:09.367801 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 22:39:11.389858 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:11.389823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerStarted","Data":"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"}
Apr 24 22:39:11.390228 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:11.389866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerStarted","Data":"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"}
Apr 24 22:39:11.390228 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:11.390069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"
Apr 24 22:39:11.409378 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:11.409320 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podStartSLOduration=1.055728196 podStartE2EDuration="23.409302289s" podCreationTimestamp="2026-04-24 22:38:48 +0000 UTC" firstStartedPulling="2026-04-24 22:38:48.560790061 +0000 UTC m=+534.482433305" lastFinishedPulling="2026-04-24 22:39:10.914364159 +0000 UTC m=+556.836007398" observedRunningTime="2026-04-24 22:39:11.408344804 +0000 UTC m=+557.329988066" watchObservedRunningTime="2026-04-24 22:39:11.409302289 +0000 UTC m=+557.330945614"
Apr 24 22:39:12.396986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:12.396957 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"
Apr 24 22:39:12.398316 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:12.398290 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:13.400288 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:13.400250 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:18.404711 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:18.404679 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"
Apr 24 22:39:18.405337 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:18.405308 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:19.368400 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:19.368356 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 22:39:28.405774 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:28.405734 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:29.368696 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:29.368655 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 22:39:38.406205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:38.406157 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:39.368355 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:39.368313 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
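The prober.go entries above recur at roughly ten-second intervals per container (22:39:18.405, 22:39:28.405, 22:39:38.406 for the graph predictor), consistent with a readiness probe period of about 10s, while the kubelet.go "SyncLoop (probe)" lines mark the actual ready/not-ready status transitions. A sketch that rebuilds per-pod readiness timelines from a stream shaped like this log; it assumes one entry per line with the syslog-style prefix seen here, and pins the year, which that prefix omits:

    import re
    import sys
    from datetime import datetime

    # Entry shapes taken from the log above.
    TS = re.compile(r'^(\w{3} +\d+ \d{2}:\d{2}:\d{2}\.\d{6})')
    TRANSITION = re.compile(r'"SyncLoop \(probe\)" probe="readiness" status="([^"]+)" pod="([^"]+)"')
    FAILURE = re.compile(r'"Probe failed" probeType="Readiness" pod="([^"]+)"')

    def timelines(lines, year="2026"):
        events = {}  # pod name -> [(timestamp, event), ...]
        for line in lines:
            m = TS.match(line)
            if not m:
                continue  # not a timestamped kubelet entry
            when = datetime.strptime(f"{year} {m.group(1)}", "%Y %b %d %H:%M:%S.%f")
            t = TRANSITION.search(line)
            if t:
                events.setdefault(t.group(2), []).append((when, t.group(1)))
            f = FAILURE.search(line)
            if f:
                events.setdefault(f.group(1), []).append((when, "probe failure"))
        return events

    if __name__ == "__main__":
        for pod, evs in timelines(sys.stdin).items():
            print(pod, "->", ", ".join(f"{w:%H:%M:%S} {e}" for w, e in evs))

Fed the kubelet unit's journal output on stdin, this yields, per pod, the first failure, the flapping window, and the final ready transition.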
Apr 24 22:39:48.405552 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:48.405507 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:39:49.368753 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:49.368721 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"
Apr 24 22:39:58.405744 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:39:58.405702 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:40:08.405412 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:08.405330 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:40:17.193647 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.193611 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"]
Apr 24 22:40:17.194013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.193966 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" containerID="cri-o://d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37" gracePeriod=30
Apr 24 22:40:17.194013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.193996 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kube-rbac-proxy" containerID="cri-o://cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb" gracePeriod=30
Apr 24 22:40:17.349556 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.349518 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"]
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.356350 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.356322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e938d-predictor-serving-cert\"" Apr 24 22:40:17.356458 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.356371 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e938d-kube-rbac-proxy-sar-config\"" Apr 24 22:40:17.365659 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.365629 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"] Apr 24 22:40:17.491723 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.491618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.491723 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.491673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.491723 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.491707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xdz\" (UniqueName: \"kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.592965 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.592927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.593171 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.592981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xdz\" (UniqueName: \"kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.593171 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.593081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") pod 
\"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.593171 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:17.593169 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-e938d-predictor-serving-cert: secret "error-404-isvc-e938d-predictor-serving-cert" not found Apr 24 22:40:17.593344 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:17.593239 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls podName:68aef45b-ce2a-449b-9034-2c357befa414 nodeName:}" failed. No retries permitted until 2026-04-24 22:40:18.09321838 +0000 UTC m=+624.014861628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls") pod "error-404-isvc-e938d-predictor-8478f5b96-brdgr" (UID: "68aef45b-ce2a-449b-9034-2c357befa414") : secret "error-404-isvc-e938d-predictor-serving-cert" not found Apr 24 22:40:17.593763 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.593737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.601992 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.601962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xdz\" (UniqueName: \"kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:17.623476 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.623443 2575 generic.go:358] "Generic (PLEG): container finished" podID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerID="cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb" exitCode=2 Apr 24 22:40:17.623644 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:17.623522 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerDied","Data":"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb"} Apr 24 22:40:18.098804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.098769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:18.101227 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.101197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") pod \"error-404-isvc-e938d-predictor-8478f5b96-brdgr\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") " 
pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:18.274665 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.274615 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:18.405887 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.405857 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" Apr 24 22:40:18.416432 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.416395 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"] Apr 24 22:40:18.418953 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:40:18.418922 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68aef45b_ce2a_449b_9034_2c357befa414.slice/crio-3e6e10205147c3d9821fea0ee4cfddd2beafeb2e6685600a6fb415bce2c98b33 WatchSource:0}: Error finding container 3e6e10205147c3d9821fea0ee4cfddd2beafeb2e6685600a6fb415bce2c98b33: Status 404 returned error can't find the container with id 3e6e10205147c3d9821fea0ee4cfddd2beafeb2e6685600a6fb415bce2c98b33 Apr 24 22:40:18.420748 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.420729 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:40:18.628866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.628777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerStarted","Data":"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"} Apr 24 22:40:18.628866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.628813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerStarted","Data":"d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac"} Apr 24 22:40:18.628866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.628830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerStarted","Data":"3e6e10205147c3d9821fea0ee4cfddd2beafeb2e6685600a6fb415bce2c98b33"} Apr 24 22:40:18.629104 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.629057 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:18.629199 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.629181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:18.630457 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.630432 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:18.646539 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:18.646494 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podStartSLOduration=1.64648157 podStartE2EDuration="1.64648157s" podCreationTimestamp="2026-04-24 22:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:18.645343175 +0000 UTC m=+624.566986439" watchObservedRunningTime="2026-04-24 22:40:18.64648157 +0000 UTC m=+624.568124830" Apr 24 22:40:19.362757 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:19.362715 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 24 22:40:19.367801 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:19.367773 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:40:19.632604 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:19.632498 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:20.350838 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.350816 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:40:20.421929 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.421846 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz58w\" (UniqueName: \"kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w\") pod \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " Apr 24 22:40:20.421929 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.421924 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " Apr 24 22:40:20.422335 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.421954 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls\") pod \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\" (UID: \"e27eabfc-86ce-49e3-b17b-33d45d7c4352\") " Apr 24 22:40:20.422335 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.422301 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1b9dc-kube-rbac-proxy-sar-config") pod "e27eabfc-86ce-49e3-b17b-33d45d7c4352" (UID: "e27eabfc-86ce-49e3-b17b-33d45d7c4352"). 
Apr 24 22:40:20.422335 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.422301 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1b9dc-kube-rbac-proxy-sar-config") pod "e27eabfc-86ce-49e3-b17b-33d45d7c4352" (UID: "e27eabfc-86ce-49e3-b17b-33d45d7c4352"). InnerVolumeSpecName "error-404-isvc-1b9dc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:40:20.424233 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.424195 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e27eabfc-86ce-49e3-b17b-33d45d7c4352" (UID: "e27eabfc-86ce-49e3-b17b-33d45d7c4352"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:40:20.424233 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.424218 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w" (OuterVolumeSpecName: "kube-api-access-qz58w") pod "e27eabfc-86ce-49e3-b17b-33d45d7c4352" (UID: "e27eabfc-86ce-49e3-b17b-33d45d7c4352"). InnerVolumeSpecName "kube-api-access-qz58w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:40:20.523205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.523151 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qz58w\" (UniqueName: \"kubernetes.io/projected/e27eabfc-86ce-49e3-b17b-33d45d7c4352-kube-api-access-qz58w\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:40:20.523205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.523198 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27eabfc-86ce-49e3-b17b-33d45d7c4352-error-404-isvc-1b9dc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:40:20.523205 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.523210 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27eabfc-86ce-49e3-b17b-33d45d7c4352-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:40:20.638002 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.637966 2575 generic.go:358] "Generic (PLEG): container finished" podID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerID="d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37" exitCode=0
Apr 24 22:40:20.638186 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.638018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerDied","Data":"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37"}
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" Apr 24 22:40:20.638186 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.638055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv" event={"ID":"e27eabfc-86ce-49e3-b17b-33d45d7c4352","Type":"ContainerDied","Data":"7f972f22b8c5fa00b7f6fc0542efd6e1d757cdeb12c05d8fa04add12e7280ce4"} Apr 24 22:40:20.638186 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.638071 2575 scope.go:117] "RemoveContainer" containerID="cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb" Apr 24 22:40:20.647807 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.647781 2575 scope.go:117] "RemoveContainer" containerID="d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37" Apr 24 22:40:20.655800 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.655776 2575 scope.go:117] "RemoveContainer" containerID="cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb" Apr 24 22:40:20.656061 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:20.656042 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb\": container with ID starting with cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb not found: ID does not exist" containerID="cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb" Apr 24 22:40:20.656131 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.656067 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb"} err="failed to get container status \"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb\": rpc error: code = NotFound desc = could not find container \"cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb\": container with ID starting with cb17eac41879d4cc986a5ee7b319827f2e8f4e1e215faa125fac4a2dd88942eb not found: ID does not exist" Apr 24 22:40:20.656131 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.656088 2575 scope.go:117] "RemoveContainer" containerID="d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37" Apr 24 22:40:20.656338 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:20.656321 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37\": container with ID starting with d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37 not found: ID does not exist" containerID="d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37" Apr 24 22:40:20.656397 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.656343 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37"} err="failed to get container status \"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37\": rpc error: code = NotFound desc = could not find container \"d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37\": container with ID starting with d6230e45389a02a8fe0db428439c8e2da88cc5cb07377a910fecf9d673e88f37 not found: ID does not exist" Apr 24 22:40:20.656550 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.656530 2575 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"] Apr 24 22:40:20.660095 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:20.660072 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1b9dc-predictor-7c988b6956-f7lkv"] Apr 24 22:40:22.565660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:22.565630 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" path="/var/lib/kubelet/pods/e27eabfc-86ce-49e3-b17b-33d45d7c4352/volumes" Apr 24 22:40:24.637310 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:24.637281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" Apr 24 22:40:24.637931 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:24.637898 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:34.638557 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:34.638518 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:44.637975 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:44.637936 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:54.637898 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:54.637861 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:40:56.943317 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:56.943282 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"] Apr 24 22:40:56.943883 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:56.943823 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" containerID="cri-o://0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d" gracePeriod=30 Apr 24 22:40:56.944192 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:56.944010 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kube-rbac-proxy" containerID="cri-o://611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886" gracePeriod=30 Apr 24 22:40:57.101937 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.101901 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"] Apr 
Apr 24 22:40:57.102302 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102290 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kube-rbac-proxy"
Apr 24 22:40:57.102346 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102304 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kube-rbac-proxy"
Apr 24 22:40:57.102346 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102321 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container"
Apr 24 22:40:57.102346 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102327 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container"
Apr 24 22:40:57.102438 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102387 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kserve-container"
Apr 24 22:40:57.102438 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.102396 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27eabfc-86ce-49e3-b17b-33d45d7c4352" containerName="kube-rbac-proxy"
Apr 24 22:40:57.105491 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.105473 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:40:57.107866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.107837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f8431-predictor-serving-cert\""
Apr 24 22:40:57.107866 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.107845 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f8431-kube-rbac-proxy-sar-config\""
Apr 24 22:40:57.114120 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.114095 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"]
Apr 24 22:40:57.236424 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.236322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:40:57.236424 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.236372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc9gl\" (UniqueName: \"kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
\"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.337075 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.337034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.337277 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.337087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc9gl\" (UniqueName: \"kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.337277 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.337120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.337277 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:57.337260 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-f8431-predictor-serving-cert: secret "error-404-isvc-f8431-predictor-serving-cert" not found Apr 24 22:40:57.337436 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:40:57.337318 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls podName:77c00532-53e1-4786-8092-b743bfe09f10 nodeName:}" failed. No retries permitted until 2026-04-24 22:40:57.837298979 +0000 UTC m=+663.758942218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls") pod "error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" (UID: "77c00532-53e1-4786-8092-b743bfe09f10") : secret "error-404-isvc-f8431-predictor-serving-cert" not found Apr 24 22:40:57.337747 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.337725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.345678 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.345655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc9gl\" (UniqueName: \"kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.763768 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.763731 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerID="611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886" exitCode=2 Apr 24 22:40:57.763937 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.763809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerDied","Data":"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"} Apr 24 22:40:57.842450 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.842415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:57.844904 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:57.844871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") pod \"error-404-isvc-f8431-predictor-6b847f4b94-q6dxt\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:40:58.017437 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.017356 2575 util.go:30] "No sandbox for pod can be found. 
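
The nestedpendingoperations.go:348 error above is the volume manager's backoff at work: MountVolume.SetUp for "proxy-tls" fails because the serving-cert secret does not exist yet, further retries are blocked for 500ms (durationBeforeRetry), and the retry at 22:40:57.842 succeeds once the secret has been created. A minimal sketch of that retry-with-growing-backoff shape (illustrative Go, not kubelet's implementation; the initial 500ms matches the log, but the doubling factor and cap here are assumptions, not kubelet's exact tuning):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryWithBackoff retries op, doubling the wait between attempts up
    // to a cap, mirroring the "No retries permitted until ...
    // (durationBeforeRetry 500ms)" behavior seen in the log.
    func retryWithBackoff(op func() error, initial, cap time.Duration) error {
        wait := initial
        for {
            err := op()
            if err == nil {
                return nil
            }
            fmt.Printf("operation failed: %v; no retries permitted for %v\n", err, wait)
            time.Sleep(wait)
            if wait *= 2; wait > cap {
                wait = cap
            }
        }
    }

    func main() {
        attempts := 0
        _ = retryWithBackoff(func() error {
            attempts++
            if attempts < 3 {
                // Simulates the missing secret from the log entries above.
                return errors.New(`secret "error-404-isvc-f8431-predictor-serving-cert" not found`)
            }
            return nil // secret created; MountVolume.SetUp succeeds
        }, 500*time.Millisecond, 2*time.Minute)
    }
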
Apr 24 22:40:58.017437 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.017356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:40:58.146958 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.146932 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"]
Apr 24 22:40:58.149523 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:40:58.149486 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c00532_53e1_4786_8092_b743bfe09f10.slice/crio-d169c339bd05d124fe44f2b6bb0e6482ee0aca25686252c48fdb59b84cb51b90 WatchSource:0}: Error finding container d169c339bd05d124fe44f2b6bb0e6482ee0aca25686252c48fdb59b84cb51b90: Status 404 returned error can't find the container with id d169c339bd05d124fe44f2b6bb0e6482ee0aca25686252c48fdb59b84cb51b90
Apr 24 22:40:58.401417 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.401315 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused"
Apr 24 22:40:58.405627 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.405576 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 22:40:58.768852 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.768815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerStarted","Data":"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826"}
Apr 24 22:40:58.768852 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.768854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerStarted","Data":"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e"}
Apr 24 22:40:58.769060 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.768866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerStarted","Data":"d169c339bd05d124fe44f2b6bb0e6482ee0aca25686252c48fdb59b84cb51b90"}
Apr 24 22:40:58.769060 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.768917 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:40:58.788102 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:58.788053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podStartSLOduration=1.788037111 podStartE2EDuration="1.788037111s" podCreationTimestamp="2026-04-24 22:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:58.78682055 +0000 UTC m=+664.708463812" watchObservedRunningTime="2026-04-24 22:40:58.788037111 +0000 UTC m=+664.709680433"
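
The pod_startup_latency_tracker entry above is straightforward arithmetic: podStartSLOduration is the observed running time minus podCreationTimestamp, and the zeroed firstStartedPulling/lastFinishedPulling values mean no image pull contributed to the latency. A quick check of the numbers from that entry (Go; the timestamp strings are copied from the log, the layout strings are Go's standard reference-time format):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // podCreationTimestamp and watchObservedRunningTime from the log entry.
        created, _ := time.Parse("2006-01-02 15:04:05 -0700 MST", "2026-04-24 22:40:57 +0000 UTC")
        observed, _ := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", "2026-04-24 22:40:58.788037111 +0000 UTC")
        fmt.Println(observed.Sub(created)) // 1.788037111s, matching podStartSLOduration
    }
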
Apr 24 22:40:59.771925 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:59.771898 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:40:59.773337 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:40:59.773308 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 22:41:00.775218 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:00.775179 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 22:41:01.484369 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.484346 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"
Apr 24 22:41:01.581054 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.580970 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") "
Apr 24 22:41:01.581054 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.581032 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location\") pod \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") "
Apr 24 22:41:01.581300 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.581093 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls\") pod \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") "
Apr 24 22:41:01.581300 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.581115 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljsm7\" (UniqueName: \"kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7\") pod \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\" (UID: \"4ff3301f-15eb-4fca-a9cf-0ef15031bd39\") "
Apr 24 22:41:01.581405 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.581355 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ff3301f-15eb-4fca-a9cf-0ef15031bd39" (UID: "4ff3301f-15eb-4fca-a9cf-0ef15031bd39"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:41:01.581405 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.581375 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "4ff3301f-15eb-4fca-a9cf-0ef15031bd39" (UID: "4ff3301f-15eb-4fca-a9cf-0ef15031bd39"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:41:01.583201 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.583178 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ff3301f-15eb-4fca-a9cf-0ef15031bd39" (UID: "4ff3301f-15eb-4fca-a9cf-0ef15031bd39"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:41:01.583284 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.583181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7" (OuterVolumeSpecName: "kube-api-access-ljsm7") pod "4ff3301f-15eb-4fca-a9cf-0ef15031bd39" (UID: "4ff3301f-15eb-4fca-a9cf-0ef15031bd39"). InnerVolumeSpecName "kube-api-access-ljsm7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:41:01.681985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.681943 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljsm7\" (UniqueName: \"kubernetes.io/projected/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kube-api-access-ljsm7\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:41:01.681985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.681977 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:41:01.681985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.681990 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-kserve-provision-location\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:41:01.681985 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.682000 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ff3301f-15eb-4fca-a9cf-0ef15031bd39-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:41:01.780655 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.780622 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerID="0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d" exitCode=0
Apr 24 22:41:01.781062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.780694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerDied","Data":"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"}
Apr 24 22:41:01.781062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.780726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686" event={"ID":"4ff3301f-15eb-4fca-a9cf-0ef15031bd39","Type":"ContainerDied","Data":"e4141f9cdbbd1158bdd4990d28682bf66b9cad4047d33ab5ff1db960b2deb173"}
Apr 24 22:41:01.781062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.780744 2575 scope.go:117] "RemoveContainer" containerID="611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"
Apr 24 22:41:01.781062 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.780760 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"
Apr 24 22:41:01.789364 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.789342 2575 scope.go:117] "RemoveContainer" containerID="0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"
Apr 24 22:41:01.797232 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.797214 2575 scope.go:117] "RemoveContainer" containerID="3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"
Apr 24 22:41:01.803646 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.803621 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"]
Apr 24 22:41:01.805954 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.805931 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7db9dc6879-jr686"]
Apr 24 22:41:01.806141 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806126 2575 scope.go:117] "RemoveContainer" containerID="611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"
Apr 24 22:41:01.806406 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:41:01.806387 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886\": container with ID starting with 611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886 not found: ID does not exist" containerID="611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"
Apr 24 22:41:01.806482 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806418 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886"} err="failed to get container status \"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886\": rpc error: code = NotFound desc = could not find container \"611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886\": container with ID starting with 611a76604eb14ca8fccdeff3d1995b8dfeb13264caf56135f9430a0b12db4886 not found: ID does not exist"
Apr 24 22:41:01.806482 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806444 2575 scope.go:117] "RemoveContainer" containerID="0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"
Apr 24 22:41:01.806717 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:41:01.806698 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d\": container with ID starting with 0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d not found: ID does not exist" containerID="0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"
Apr 24 22:41:01.806793 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806727 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d"} err="failed to get container status \"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d\": rpc error: code = NotFound desc = could not find container \"0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d\": container with ID starting with 0897ced12a429104b30cd9190d8a6893425dc0bb7b096f20bad9c114465f6c8d not found: ID does not exist"
Apr 24 22:41:01.806793 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806748 2575 scope.go:117] "RemoveContainer" containerID="3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"
Apr 24 22:41:01.806962 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:41:01.806947 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a\": container with ID starting with 3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a not found: ID does not exist" containerID="3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"
Apr 24 22:41:01.807013 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:01.806969 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a"} err="failed to get container status \"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a\": rpc error: code = NotFound desc = could not find container \"3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a\": container with ID starting with 3b4e3a7a2470957b42d715c2ba3d8a326eb5877c6a2fd8b7efa9a8af59fc560a not found: ID does not exist"
Apr 24 22:41:02.566611 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:02.566557 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" path="/var/lib/kubelet/pods/4ff3301f-15eb-4fca-a9cf-0ef15031bd39/volumes"
Apr 24 22:41:04.639363 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:04.639334 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"
Apr 24 22:41:05.779165 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:05.779138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"
Apr 24 22:41:05.779584 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:05.779560 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 22:41:15.779559 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:15.779521 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
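
The paired "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" entries above are benign: the containers were already removed along with the pod sandbox, so the follow-up RemoveContainer calls find nothing and the kubelet logs the error and moves on. Cleanup code is commonly written so a NotFound counts as success; a self-contained sketch of that pattern (illustrative Go with a stand-in delete function and sentinel error, not the real CRI client):

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the runtime's NotFound status
    // (in the log: rpc error: code = NotFound).
    var errNotFound = errors.New("container not found")

    // removeContainer mirrors the cleanup pattern visible in the log:
    // a container already deleted with its pod sandbox makes the
    // follow-up delete a no-op rather than a hard failure.
    func removeContainer(id string, deleteFn func(string) error) error {
        if err := deleteFn(id); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("container %s already gone; treating delete as done\n", id)
                return nil
            }
            return fmt.Errorf("remove container %s: %w", id, err)
        }
        return nil
    }

    func main() {
        alreadyGone := func(string) error { return errNotFound }
        fmt.Println(removeContainer("0897ced12a42", alreadyGone)) // <nil>
    }
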
pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:41:35.779663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:35.779619 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:41:45.780804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:41:45.780727 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:49:31.974361 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:31.974283 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"] Apr 24 22:49:31.974881 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:31.974571 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" containerID="cri-o://d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac" gracePeriod=30 Apr 24 22:49:31.974881 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:31.974643 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kube-rbac-proxy" containerID="cri-o://bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e" gracePeriod=30 Apr 24 22:49:32.056130 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056095 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"] Apr 24 22:49:32.056515 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056502 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="storage-initializer" Apr 24 22:49:32.056577 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056515 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="storage-initializer" Apr 24 22:49:32.056577 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056528 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kube-rbac-proxy" Apr 24 22:49:32.056577 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056533 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kube-rbac-proxy" Apr 24 22:49:32.056577 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056545 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" Apr 24 22:49:32.056577 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056551 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" Apr 24 22:49:32.056876 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056657 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kube-rbac-proxy" Apr 24 22:49:32.056876 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.056667 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ff3301f-15eb-4fca-a9cf-0ef15031bd39" containerName="kserve-container" Apr 24 22:49:32.059973 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.059951 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.062306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.062277 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5c921-kube-rbac-proxy-sar-config\"" Apr 24 22:49:32.062306 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.062280 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5c921-predictor-serving-cert\"" Apr 24 22:49:32.070418 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.070393 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"] Apr 24 22:49:32.177238 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.177206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.177417 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.177279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.177482 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.177411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97twg\" (UniqueName: \"kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.278883 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.278794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97twg\" (UniqueName: \"kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.278883 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.278841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: 
\"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.279118 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.278912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.279658 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.279630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.281285 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.281267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.290114 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.290092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97twg\" (UniqueName: \"kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg\") pod \"error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:32.373520 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.373480 2575 util.go:30] "No sandbox for pod can be found. 
Apr 24 22:49:32.373520 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.373480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"
Apr 24 22:49:32.500990 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.500958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"]
Apr 24 22:49:32.504226 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:49:32.504198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f98fde_f0aa_415a_a8a4_02c829157a8f.slice/crio-5ff7c5012cd6f6d0bbe9282798b576aab6cbe9021121767f364e3aea03f2ab3e WatchSource:0}: Error finding container 5ff7c5012cd6f6d0bbe9282798b576aab6cbe9021121767f364e3aea03f2ab3e: Status 404 returned error can't find the container with id 5ff7c5012cd6f6d0bbe9282798b576aab6cbe9021121767f364e3aea03f2ab3e
Apr 24 22:49:32.505962 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.505946 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:49:32.523456 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.523425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerStarted","Data":"5ff7c5012cd6f6d0bbe9282798b576aab6cbe9021121767f364e3aea03f2ab3e"}
Apr 24 22:49:32.525105 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.525080 2575 generic.go:358] "Generic (PLEG): container finished" podID="68aef45b-ce2a-449b-9034-2c357befa414" containerID="bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e" exitCode=2
Apr 24 22:49:32.525197 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:32.525137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerDied","Data":"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"}
Apr 24 22:49:33.529581 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:33.529547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerStarted","Data":"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab"}
Apr 24 22:49:33.529581 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:33.529584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerStarted","Data":"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579"}
Apr 24 22:49:33.530018 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:33.529754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"
Apr 24 22:49:33.548320 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:33.548259 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podStartSLOduration=1.548238858 podStartE2EDuration="1.548238858s" podCreationTimestamp="2026-04-24 22:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:33.545445344 +0000 UTC m=+1179.467088628" watchObservedRunningTime="2026-04-24 22:49:33.548238858 +0000 UTC m=+1179.469882120"
Apr 24 22:49:34.533454 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:34.533422 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"
Apr 24 22:49:34.534531 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:34.534500 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:49:34.633178 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:34.633129 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 24 22:49:34.638524 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:34.638494 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 22:49:35.118714 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.118692 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"
Apr 24 22:49:35.205262 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.205171 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"68aef45b-ce2a-449b-9034-2c357befa414\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") "
Apr 24 22:49:35.205262 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.205222 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") pod \"68aef45b-ce2a-449b-9034-2c357befa414\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") "
Apr 24 22:49:35.205262 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.205243 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5xdz\" (UniqueName: \"kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz\") pod \"68aef45b-ce2a-449b-9034-2c357befa414\" (UID: \"68aef45b-ce2a-449b-9034-2c357befa414\") "
Apr 24 22:49:35.205636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.205579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e938d-kube-rbac-proxy-sar-config") pod "68aef45b-ce2a-449b-9034-2c357befa414" (UID: "68aef45b-ce2a-449b-9034-2c357befa414"). InnerVolumeSpecName "error-404-isvc-e938d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:49:35.207427 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.207395 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "68aef45b-ce2a-449b-9034-2c357befa414" (UID: "68aef45b-ce2a-449b-9034-2c357befa414"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:49:35.207427 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.207396 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz" (OuterVolumeSpecName: "kube-api-access-v5xdz") pod "68aef45b-ce2a-449b-9034-2c357befa414" (UID: "68aef45b-ce2a-449b-9034-2c357befa414"). InnerVolumeSpecName "kube-api-access-v5xdz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:49:35.306663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.306620 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68aef45b-ce2a-449b-9034-2c357befa414-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.306663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.306653 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5xdz\" (UniqueName: \"kubernetes.io/projected/68aef45b-ce2a-449b-9034-2c357befa414-kube-api-access-v5xdz\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.306663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.306664 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68aef45b-ce2a-449b-9034-2c357befa414-error-404-isvc-e938d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.537934 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.537836 2575 generic.go:358] "Generic (PLEG): container finished" podID="68aef45b-ce2a-449b-9034-2c357befa414" containerID="d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac" exitCode=0
Apr 24 22:49:35.537934 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.537917 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"
Apr 24 22:49:35.538428 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.537925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerDied","Data":"d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac"}
Apr 24 22:49:35.538428 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.537965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr" event={"ID":"68aef45b-ce2a-449b-9034-2c357befa414","Type":"ContainerDied","Data":"3e6e10205147c3d9821fea0ee4cfddd2beafeb2e6685600a6fb415bce2c98b33"}
Apr 24 22:49:35.538428 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.537982 2575 scope.go:117] "RemoveContainer" containerID="bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"
Apr 24 22:49:35.538654 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.538628 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:49:35.547887 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.547867 2575 scope.go:117] "RemoveContainer" containerID="d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac"
Apr 24 22:49:35.555351 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.555331 2575 scope.go:117] "RemoveContainer" containerID="bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"
Apr 24 22:49:35.555627 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:49:35.555585 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e\": container with ID starting with bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e not found: ID does not exist" containerID="bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"
Apr 24 22:49:35.555710 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.555638 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e"} err="failed to get container status \"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e\": rpc error: code = NotFound desc = could not find container \"bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e\": container with ID starting with bfa7b2ebddfc429a4b3418bd4a1430c46dc39f2d86ed8e4313e689d3ebf4ed6e not found: ID does not exist"
Apr 24 22:49:35.555710 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.555657 2575 scope.go:117] "RemoveContainer" containerID="d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac"
containerID="d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac" Apr 24 22:49:35.555937 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.555905 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac"} err="failed to get container status \"d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac\": rpc error: code = NotFound desc = could not find container \"d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac\": container with ID starting with d83624e540a3a1404934e1427afdde18067c7410f414710717ca0f899604f1ac not found: ID does not exist" Apr 24 22:49:35.559909 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.559887 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"] Apr 24 22:49:35.563386 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:35.563367 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e938d-predictor-8478f5b96-brdgr"] Apr 24 22:49:36.566115 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:36.566071 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68aef45b-ce2a-449b-9034-2c357befa414" path="/var/lib/kubelet/pods/68aef45b-ce2a-449b-9034-2c357befa414/volumes" Apr 24 22:49:40.543140 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:40.543109 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:49:40.543774 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:40.543741 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:49:50.544122 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:49:50.544081 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:50:00.544282 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:00.544242 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:50:10.543833 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:10.543789 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:50:12.000697 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.000661 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"] Apr 24 22:50:12.001207 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.001151 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" containerID="cri-o://1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e" gracePeriod=30 Apr 24 22:50:12.001679 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.001221 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kube-rbac-proxy" containerID="cri-o://0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826" gracePeriod=30 Apr 24 22:50:12.056766 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.056729 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"] Apr 24 22:50:12.057303 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057288 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" Apr 24 22:50:12.057345 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057307 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" Apr 24 22:50:12.057345 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057327 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kube-rbac-proxy" Apr 24 22:50:12.057345 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057336 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kube-rbac-proxy" Apr 24 22:50:12.057451 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057441 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kube-rbac-proxy" Apr 24 22:50:12.057486 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.057459 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="68aef45b-ce2a-449b-9034-2c357befa414" containerName="kserve-container" Apr 24 22:50:12.061072 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.061050 2575 util.go:30] "No sandbox for pod can be found. 
Apr 24 22:50:12.061072 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.061050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.063555 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.063531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-32c7c-predictor-serving-cert\""
Apr 24 22:50:12.063555 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.063541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\""
Apr 24 22:50:12.068664 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.068630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"]
Apr 24 22:50:12.112293 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.112258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxp7\" (UniqueName: \"kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.112449 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.112313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.112449 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.112430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.212917 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.212878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.213100 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.213028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxp7\" (UniqueName: \"kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.213100 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:12.213044 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-serving-cert: secret "error-404-isvc-32c7c-predictor-serving-cert" not found
Apr 24 22:50:12.213100 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.213087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.213258 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:12.213120 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls podName:675b0d26-e528-409f-951e-a218d9d2b936 nodeName:}" failed. No retries permitted until 2026-04-24 22:50:12.713100062 +0000 UTC m=+1218.634743308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls") pod "error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" (UID: "675b0d26-e528-409f-951e-a218d9d2b936") : secret "error-404-isvc-32c7c-predictor-serving-cert" not found
Apr 24 22:50:12.213650 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.213632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.221263 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.221245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxp7\" (UniqueName: \"kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.666088 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.666052 2575 generic.go:358] "Generic (PLEG): container finished" podID="77c00532-53e1-4786-8092-b743bfe09f10" containerID="0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826" exitCode=2
Apr 24 22:50:12.666264 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.666125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerDied","Data":"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826"}
Apr 24 22:50:12.718653 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.718614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.721100 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.721075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") pod \"error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:12.973175 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:12.973137 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:13.100949 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.100923 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"]
Apr 24 22:50:13.103113 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:50:13.103078 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675b0d26_e528_409f_951e_a218d9d2b936.slice/crio-7413b6eb820c0a82ada5dc19cdd69fca39c40a9bab5caab4c1bed13e98adc41f WatchSource:0}: Error finding container 7413b6eb820c0a82ada5dc19cdd69fca39c40a9bab5caab4c1bed13e98adc41f: Status 404 returned error can't find the container with id 7413b6eb820c0a82ada5dc19cdd69fca39c40a9bab5caab4c1bed13e98adc41f
Apr 24 22:50:13.671477 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.671385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerStarted","Data":"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44"}
Apr 24 22:50:13.671477 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.671423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerStarted","Data":"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945"}
Apr 24 22:50:13.671477 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.671435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerStarted","Data":"7413b6eb820c0a82ada5dc19cdd69fca39c40a9bab5caab4c1bed13e98adc41f"}
Apr 24 22:50:13.671477 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.671465 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:13.688983 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:13.688941 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podStartSLOduration=1.688926349 podStartE2EDuration="1.688926349s" podCreationTimestamp="2026-04-24 22:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:13.687477241 +0000 UTC m=+1219.609120502" watchObservedRunningTime="2026-04-24 22:50:13.688926349 +0000 UTC m=+1219.610569611"
Apr 24 22:50:14.674986 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:14.674955 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"
Apr 24 22:50:14.676311 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:14.676277 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:15.158659 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.158635 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:50:15.241139 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.241040 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc9gl\" (UniqueName: \"kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl\") pod \"77c00532-53e1-4786-8092-b743bfe09f10\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " Apr 24 22:50:15.241139 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.241103 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"77c00532-53e1-4786-8092-b743bfe09f10\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " Apr 24 22:50:15.241353 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.241146 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") pod \"77c00532-53e1-4786-8092-b743bfe09f10\" (UID: \"77c00532-53e1-4786-8092-b743bfe09f10\") " Apr 24 22:50:15.241466 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.241442 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f8431-kube-rbac-proxy-sar-config") pod "77c00532-53e1-4786-8092-b743bfe09f10" (UID: "77c00532-53e1-4786-8092-b743bfe09f10"). InnerVolumeSpecName "error-404-isvc-f8431-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:50:15.243208 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.243187 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl" (OuterVolumeSpecName: "kube-api-access-tc9gl") pod "77c00532-53e1-4786-8092-b743bfe09f10" (UID: "77c00532-53e1-4786-8092-b743bfe09f10"). InnerVolumeSpecName "kube-api-access-tc9gl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:50:15.243281 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.243267 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "77c00532-53e1-4786-8092-b743bfe09f10" (UID: "77c00532-53e1-4786-8092-b743bfe09f10"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:50:15.341975 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.341933 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77c00532-53e1-4786-8092-b743bfe09f10-error-404-isvc-f8431-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:15.341975 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.341969 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77c00532-53e1-4786-8092-b743bfe09f10-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:15.341975 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.341979 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc9gl\" (UniqueName: \"kubernetes.io/projected/77c00532-53e1-4786-8092-b743bfe09f10-kube-api-access-tc9gl\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:15.678925 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.678846 2575 generic.go:358] "Generic (PLEG): container finished" podID="77c00532-53e1-4786-8092-b743bfe09f10" containerID="1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e" exitCode=0 Apr 24 22:50:15.678925 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.678919 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" Apr 24 22:50:15.679408 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.678935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerDied","Data":"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e"} Apr 24 22:50:15.679408 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.678973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt" event={"ID":"77c00532-53e1-4786-8092-b743bfe09f10","Type":"ContainerDied","Data":"d169c339bd05d124fe44f2b6bb0e6482ee0aca25686252c48fdb59b84cb51b90"} Apr 24 22:50:15.679408 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.678989 2575 scope.go:117] "RemoveContainer" containerID="0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826" Apr 24 22:50:15.679636 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.679585 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:15.688330 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.688309 2575 scope.go:117] "RemoveContainer" containerID="1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e" Apr 24 22:50:15.695518 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.695501 2575 scope.go:117] "RemoveContainer" containerID="0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826" Apr 24 22:50:15.695765 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:15.695744 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826\": container with ID 
starting with 0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826 not found: ID does not exist" containerID="0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826" Apr 24 22:50:15.695816 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.695772 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826"} err="failed to get container status \"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826\": rpc error: code = NotFound desc = could not find container \"0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826\": container with ID starting with 0bc74e9a09797aa93c2b114958109fd0e32708b103b1f769acdc8de1f630c826 not found: ID does not exist" Apr 24 22:50:15.695816 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.695791 2575 scope.go:117] "RemoveContainer" containerID="1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e" Apr 24 22:50:15.695979 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:15.695960 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e\": container with ID starting with 1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e not found: ID does not exist" containerID="1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e" Apr 24 22:50:15.696021 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.695985 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e"} err="failed to get container status \"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e\": rpc error: code = NotFound desc = could not find container \"1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e\": container with ID starting with 1fe9f435734c545e96b9df2cf05a129539630b034e7d05331c9ccfd086d9654e not found: ID does not exist" Apr 24 22:50:15.703003 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.702979 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"] Apr 24 22:50:15.708413 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:15.708392 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f8431-predictor-6b847f4b94-q6dxt"] Apr 24 22:50:16.566129 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:16.566097 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c00532-53e1-4786-8092-b743bfe09f10" path="/var/lib/kubelet/pods/77c00532-53e1-4786-8092-b743bfe09f10/volumes" Apr 24 22:50:20.544618 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:20.544553 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:50:20.683988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:20.683956 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" Apr 24 22:50:20.684461 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:20.684424 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:30.684537 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:30.684496 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:40.685407 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:40.685320 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:42.366566 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.366530 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"] Apr 24 22:50:42.367037 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.366835 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" containerID="cri-o://09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579" gracePeriod=30 Apr 24 22:50:42.367037 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.366912 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kube-rbac-proxy" containerID="cri-o://e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab" gracePeriod=30 Apr 24 22:50:42.415324 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 22:50:42.415861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415835 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kube-rbac-proxy" Apr 24 22:50:42.415861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415856 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kube-rbac-proxy" Apr 24 22:50:42.416028 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415879 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" Apr 24 22:50:42.416028 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415885 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" Apr 24 22:50:42.416028 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415954 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kube-rbac-proxy" Apr 24 22:50:42.416028 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.415972 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="77c00532-53e1-4786-8092-b743bfe09f10" containerName="kserve-container" Apr 24 22:50:42.419203 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.419178 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.421731 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.421706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\"" Apr 24 22:50:42.421845 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.421790 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4aba3-predictor-serving-cert\"" Apr 24 22:50:42.430326 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.430297 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 22:50:42.583660 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.583584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.583833 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.583673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.583833 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.583756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kw6j\" (UniqueName: \"kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.684568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.684489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kw6j\" (UniqueName: \"kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.684568 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.684563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.684848 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.684628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.684848 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:42.684736 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-serving-cert: secret "error-404-isvc-4aba3-predictor-serving-cert" not found Apr 24 22:50:42.684848 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:42.684819 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls podName:5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21 nodeName:}" failed. No retries permitted until 2026-04-24 22:50:43.18479612 +0000 UTC m=+1249.106439380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls") pod "error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" (UID: "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21") : secret "error-404-isvc-4aba3-predictor-serving-cert" not found Apr 24 22:50:42.685246 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.685226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.692911 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.692889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kw6j\" (UniqueName: \"kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:42.778926 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.778892 2575 generic.go:358] "Generic (PLEG): container finished" podID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerID="e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab" exitCode=2 Apr 24 22:50:42.779112 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:42.778960 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerDied","Data":"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab"} Apr 24 22:50:43.188318 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.188279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:43.190831 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.190811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") pod \"error-404-isvc-4aba3-predictor-6f567c46fb-kdltw\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " 
pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:43.331034 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.330994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:43.457494 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.457469 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 22:50:43.459608 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:50:43.459566 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aed5c0b_de0f_4bf1_b1e7_bf379bb3ee21.slice/crio-e7c73224d426a3451a693a93b0f677f5758e405620bc7b8dedb312c5d412e6f1 WatchSource:0}: Error finding container e7c73224d426a3451a693a93b0f677f5758e405620bc7b8dedb312c5d412e6f1: Status 404 returned error can't find the container with id e7c73224d426a3451a693a93b0f677f5758e405620bc7b8dedb312c5d412e6f1 Apr 24 22:50:43.783804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.783708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerStarted","Data":"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808"} Apr 24 22:50:43.783804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.783744 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerStarted","Data":"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227"} Apr 24 22:50:43.783804 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.783757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerStarted","Data":"e7c73224d426a3451a693a93b0f677f5758e405620bc7b8dedb312c5d412e6f1"} Apr 24 22:50:43.784077 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.783853 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:43.803869 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:43.803810 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podStartSLOduration=1.8037901920000001 podStartE2EDuration="1.803790192s" podCreationTimestamp="2026-04-24 22:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:43.800541972 +0000 UTC m=+1249.722185233" watchObservedRunningTime="2026-04-24 22:50:43.803790192 +0000 UTC m=+1249.725433455" Apr 24 22:50:44.787491 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:44.787453 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:44.788765 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:44.788736 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:50:45.614963 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.614941 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:50:45.713068 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.712966 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97twg\" (UniqueName: \"kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg\") pod \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " Apr 24 22:50:45.713068 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.713018 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls\") pod \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " Apr 24 22:50:45.713068 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.713048 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\" (UID: \"c3f98fde-f0aa-415a-a8a4-02c829157a8f\") " Apr 24 22:50:45.713518 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.713477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-5c921-kube-rbac-proxy-sar-config") pod "c3f98fde-f0aa-415a-a8a4-02c829157a8f" (UID: "c3f98fde-f0aa-415a-a8a4-02c829157a8f"). InnerVolumeSpecName "error-404-isvc-5c921-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:50:45.715198 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.715175 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg" (OuterVolumeSpecName: "kube-api-access-97twg") pod "c3f98fde-f0aa-415a-a8a4-02c829157a8f" (UID: "c3f98fde-f0aa-415a-a8a4-02c829157a8f"). InnerVolumeSpecName "kube-api-access-97twg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:50:45.715198 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.715191 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3f98fde-f0aa-415a-a8a4-02c829157a8f" (UID: "c3f98fde-f0aa-415a-a8a4-02c829157a8f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:50:45.791949 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.791915 2575 generic.go:358] "Generic (PLEG): container finished" podID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerID="09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579" exitCode=0 Apr 24 22:50:45.792355 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.791988 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" Apr 24 22:50:45.792355 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.792001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerDied","Data":"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579"} Apr 24 22:50:45.792355 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.792036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" event={"ID":"c3f98fde-f0aa-415a-a8a4-02c829157a8f","Type":"ContainerDied","Data":"5ff7c5012cd6f6d0bbe9282798b576aab6cbe9021121767f364e3aea03f2ab3e"} Apr 24 22:50:45.792355 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.792051 2575 scope.go:117] "RemoveContainer" containerID="e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab" Apr 24 22:50:45.792571 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.792541 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:50:45.801096 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.801076 2575 scope.go:117] "RemoveContainer" containerID="09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579" Apr 24 22:50:45.816247 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.814154 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"] Apr 24 22:50:45.816247 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.814880 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97twg\" (UniqueName: \"kubernetes.io/projected/c3f98fde-f0aa-415a-a8a4-02c829157a8f-kube-api-access-97twg\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:45.816247 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.814906 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3f98fde-f0aa-415a-a8a4-02c829157a8f-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:45.816247 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.814922 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3f98fde-f0aa-415a-a8a4-02c829157a8f-error-404-isvc-5c921-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:50:45.819198 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.819167 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn"] Apr 24 22:50:45.833642 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.833616 2575 scope.go:117] "RemoveContainer" containerID="e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab" Apr 24 22:50:45.833940 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:45.833916 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab\": container with ID starting with 
e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab not found: ID does not exist" containerID="e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab" Apr 24 22:50:45.833987 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.833954 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab"} err="failed to get container status \"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab\": rpc error: code = NotFound desc = could not find container \"e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab\": container with ID starting with e3be6a068e4e9e329948fc8a325249592877b5463c0aa6e9d9c565771c2d82ab not found: ID does not exist" Apr 24 22:50:45.833987 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.833974 2575 scope.go:117] "RemoveContainer" containerID="09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579" Apr 24 22:50:45.834220 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:50:45.834199 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579\": container with ID starting with 09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579 not found: ID does not exist" containerID="09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579" Apr 24 22:50:45.834256 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:45.834227 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579"} err="failed to get container status \"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579\": rpc error: code = NotFound desc = could not find container \"09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579\": container with ID starting with 09d705762e0b3f81136ed2604cde152723b3624f9d51b80fb9b6dccd58457579 not found: ID does not exist" Apr 24 22:50:46.539712 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:46.539660 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5c921-predictor-78cfcc75d6-6fxpn" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 24 22:50:46.566187 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:46.566155 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" path="/var/lib/kubelet/pods/c3f98fde-f0aa-415a-a8a4-02c829157a8f/volumes" Apr 24 22:50:50.685017 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:50.684976 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 22:50:50.797696 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:50.797662 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:50:50.798214 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:50:50.798181 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:51:00.684752 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:00.684721 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" Apr 24 22:51:00.798664 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:00.798629 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:51:10.798704 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:10.798661 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:51:20.798838 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:20.798789 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 22:51:22.200543 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.200451 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"] Apr 24 22:51:22.201032 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.200858 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" containerID="cri-o://4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945" gracePeriod=30 Apr 24 22:51:22.201032 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.200956 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kube-rbac-proxy" containerID="cri-o://ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44" gracePeriod=30 Apr 24 22:51:22.300663 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.300626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 22:51:22.301240 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301216 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" Apr 24 22:51:22.301406 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301390 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" Apr 24 22:51:22.301489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301445 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kube-rbac-proxy" Apr 24 
22:51:22.301489 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301459 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kube-rbac-proxy" Apr 24 22:51:22.301685 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301669 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kube-rbac-proxy" Apr 24 22:51:22.301793 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.301691 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3f98fde-f0aa-415a-a8a4-02c829157a8f" containerName="kserve-container" Apr 24 22:51:22.306876 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.306640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.316856 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.316835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2e6fe-predictor-serving-cert\"" Apr 24 22:51:22.317268 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.317094 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\"" Apr 24 22:51:22.320024 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.320000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 22:51:22.434988 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.434950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.435167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.434998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rqsx\" (UniqueName: \"kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.435167 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.435082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.535813 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.535721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rqsx\" (UniqueName: \"kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.535813 ip-10-0-132-138 
kubenswrapper[2575]: I0424 22:51:22.535793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.536039 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.535882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.536424 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.536401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.538281 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.538256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.544141 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.544111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rqsx\" (UniqueName: \"kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx\") pod \"error-404-isvc-2e6fe-predictor-89f79697d-42755\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.621727 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.621686 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.747542 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.747505 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 22:51:22.751991 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:51:22.751964 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e25e494_c495_44fd_b89c_009820b8b20c.slice/crio-8a05455c415022a546691b294aadfa9b141185ce0cae61f218495c24d510394c WatchSource:0}: Error finding container 8a05455c415022a546691b294aadfa9b141185ce0cae61f218495c24d510394c: Status 404 returned error can't find the container with id 8a05455c415022a546691b294aadfa9b141185ce0cae61f218495c24d510394c Apr 24 22:51:22.932114 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.932079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerStarted","Data":"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3"} Apr 24 22:51:22.932114 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.932116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerStarted","Data":"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0"} Apr 24 22:51:22.932348 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.932132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerStarted","Data":"8a05455c415022a546691b294aadfa9b141185ce0cae61f218495c24d510394c"} Apr 24 22:51:22.932348 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.932250 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:22.933694 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.933669 2575 generic.go:358] "Generic (PLEG): container finished" podID="675b0d26-e528-409f-951e-a218d9d2b936" containerID="ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44" exitCode=2 Apr 24 22:51:22.933779 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.933738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerDied","Data":"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44"} Apr 24 22:51:22.953909 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:22.953844 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podStartSLOduration=0.953820944 podStartE2EDuration="953.820944ms" podCreationTimestamp="2026-04-24 22:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:51:22.951629398 +0000 UTC m=+1288.873272661" watchObservedRunningTime="2026-04-24 22:51:22.953820944 +0000 UTC m=+1288.875464206" Apr 24 22:51:23.937509 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:23.937480 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:23.938937 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:23.938909 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:51:24.941190 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:24.941149 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:51:25.549455 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.549429 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" Apr 24 22:51:25.663294 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.663215 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"675b0d26-e528-409f-951e-a218d9d2b936\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " Apr 24 22:51:25.663434 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.663317 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") pod \"675b0d26-e528-409f-951e-a218d9d2b936\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " Apr 24 22:51:25.663434 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.663359 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxp7\" (UniqueName: \"kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7\") pod \"675b0d26-e528-409f-951e-a218d9d2b936\" (UID: \"675b0d26-e528-409f-951e-a218d9d2b936\") " Apr 24 22:51:25.663627 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.663584 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-32c7c-kube-rbac-proxy-sar-config") pod "675b0d26-e528-409f-951e-a218d9d2b936" (UID: "675b0d26-e528-409f-951e-a218d9d2b936"). InnerVolumeSpecName "error-404-isvc-32c7c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:51:25.665281 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.665259 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "675b0d26-e528-409f-951e-a218d9d2b936" (UID: "675b0d26-e528-409f-951e-a218d9d2b936"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:51:25.665379 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.665362 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7" (OuterVolumeSpecName: "kube-api-access-tqxp7") pod "675b0d26-e528-409f-951e-a218d9d2b936" (UID: "675b0d26-e528-409f-951e-a218d9d2b936"). InnerVolumeSpecName "kube-api-access-tqxp7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:51:25.764289 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.764251 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tqxp7\" (UniqueName: \"kubernetes.io/projected/675b0d26-e528-409f-951e-a218d9d2b936-kube-api-access-tqxp7\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.764289 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.764285 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/675b0d26-e528-409f-951e-a218d9d2b936-error-404-isvc-32c7c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.764289 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.764296 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/675b0d26-e528-409f-951e-a218d9d2b936-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.946118 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.946084 2575 generic.go:358] "Generic (PLEG): container finished" podID="675b0d26-e528-409f-951e-a218d9d2b936" containerID="4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945" exitCode=0 Apr 24 22:51:25.946554 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.946167 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" Apr 24 22:51:25.946554 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.946168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerDied","Data":"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945"} Apr 24 22:51:25.946554 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.946272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z" event={"ID":"675b0d26-e528-409f-951e-a218d9d2b936","Type":"ContainerDied","Data":"7413b6eb820c0a82ada5dc19cdd69fca39c40a9bab5caab4c1bed13e98adc41f"} Apr 24 22:51:25.946554 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.946288 2575 scope.go:117] "RemoveContainer" containerID="ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44" Apr 24 22:51:25.955210 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.955191 2575 scope.go:117] "RemoveContainer" containerID="4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945" Apr 24 22:51:25.962515 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.962495 2575 scope.go:117] "RemoveContainer" containerID="ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44" Apr 24 22:51:25.962783 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:51:25.962763 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44\": container with ID starting with ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44 not found: ID does not exist" containerID="ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44" Apr 24 22:51:25.962854 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.962792 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44"} err="failed to get container status \"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44\": rpc error: code = NotFound desc = could not find container \"ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44\": container with ID starting with ec31f5a5b7ea405ff1ea1be220679e44a992f3a313d48134e264c9e93989ac44 not found: ID does not exist" Apr 24 22:51:25.962854 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.962811 2575 scope.go:117] "RemoveContainer" containerID="4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945" Apr 24 22:51:25.963070 ip-10-0-132-138 kubenswrapper[2575]: E0424 22:51:25.963049 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945\": container with ID starting with 4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945 not found: ID does not exist" containerID="4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945" Apr 24 22:51:25.963110 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.963076 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945"} err="failed to get container status \"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945\": rpc error: 
code = NotFound desc = could not find container \"4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945\": container with ID starting with 4620fa7c411ffeb324a3a120d385644f27d4aae0820aaa5ddf957df7a728c945 not found: ID does not exist" Apr 24 22:51:25.968367 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.968345 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"] Apr 24 22:51:25.972093 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:25.972071 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32c7c-predictor-69fdd96fd6-v6c7z"] Apr 24 22:51:26.567164 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:26.567127 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675b0d26-e528-409f-951e-a218d9d2b936" path="/var/lib/kubelet/pods/675b0d26-e528-409f-951e-a218d9d2b936/volumes" Apr 24 22:51:29.945280 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:29.945251 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:51:29.945802 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:29.945773 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:51:30.799064 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:30.799028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 22:51:39.945835 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:39.945792 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:51:49.946263 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:49.946217 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:51:59.946323 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:51:59.946280 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:52:09.947285 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:52:09.947207 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 22:59:57.247861 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.247823 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 22:59:57.248376 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.248098 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" containerID="cri-o://467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227" gracePeriod=30 Apr 24 22:59:57.248376 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.248151 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kube-rbac-proxy" containerID="cri-o://739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808" gracePeriod=30 Apr 24 22:59:57.322798 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.322761 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"] Apr 24 22:59:57.323187 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323158 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kube-rbac-proxy" Apr 24 22:59:57.323187 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323174 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kube-rbac-proxy" Apr 24 22:59:57.323358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323191 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" Apr 24 22:59:57.323358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323196 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" Apr 24 22:59:57.323358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323285 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kube-rbac-proxy" Apr 24 22:59:57.323358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.323300 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="675b0d26-e528-409f-951e-a218d9d2b936" containerName="kserve-container" Apr 24 22:59:57.326579 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.326561 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.329085 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.329064 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-44b1e-predictor-serving-cert\"" Apr 24 22:59:57.329190 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.329083 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\"" Apr 24 22:59:57.336159 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.336140 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"] Apr 24 22:59:57.472716 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.472657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.472925 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.472753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.472925 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.472821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgs7\" (UniqueName: \"kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.573935 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.573848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgs7\" (UniqueName: \"kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.573935 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.573902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.574151 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.573974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.574678 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.574650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.576377 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.576345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.584071 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.584044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgs7\" (UniqueName: \"kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7\") pod \"error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") " pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.639673 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.639637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:57.716358 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.716325 2575 generic.go:358] "Generic (PLEG): container finished" podID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerID="739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808" exitCode=2 Apr 24 22:59:57.716524 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.716402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerDied","Data":"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808"} Apr 24 22:59:57.765379 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.765355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"] Apr 24 22:59:57.767361 ip-10-0-132-138 kubenswrapper[2575]: W0424 22:59:57.767336 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde53ce31_1453_4f66_9a58_bf6080a80a42.slice/crio-749d0af180c55b8cacf6e5810904c7f8e86465d4ca23edf9e24d5fc9d79fc0ab WatchSource:0}: Error finding container 749d0af180c55b8cacf6e5810904c7f8e86465d4ca23edf9e24d5fc9d79fc0ab: Status 404 returned error can't find the container with id 749d0af180c55b8cacf6e5810904c7f8e86465d4ca23edf9e24d5fc9d79fc0ab Apr 24 22:59:57.769047 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:57.769033 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:59:58.722777 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:58.722742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerStarted","Data":"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"} Apr 24 22:59:58.722777 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:58.722782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerStarted","Data":"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"} Apr 24 22:59:58.723200 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:58.722793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerStarted","Data":"749d0af180c55b8cacf6e5810904c7f8e86465d4ca23edf9e24d5fc9d79fc0ab"} Apr 24 22:59:58.723200 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:58.722907 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:58.741372 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:58.741314 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podStartSLOduration=1.741297624 podStartE2EDuration="1.741297624s" podCreationTimestamp="2026-04-24 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:59:58.738900716 +0000 UTC m=+1804.660543976" watchObservedRunningTime="2026-04-24 22:59:58.741297624 +0000 UTC m=+1804.662940885" Apr 24 22:59:59.732811 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:59.732776 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 22:59:59.734403 ip-10-0-132-138 kubenswrapper[2575]: I0424 22:59:59.734370 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:00.407397 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.407374 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 23:00:00.598291 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.598200 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") pod \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " Apr 24 23:00:00.598446 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.598317 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " Apr 24 23:00:00.598446 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.598374 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kw6j\" (UniqueName: \"kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j\") pod \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\" (UID: \"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21\") " Apr 24 23:00:00.598721 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.598691 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-4aba3-kube-rbac-proxy-sar-config") pod "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" (UID: "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21"). InnerVolumeSpecName "error-404-isvc-4aba3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:00:00.600479 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.600448 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" (UID: "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:00:00.600479 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.600457 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j" (OuterVolumeSpecName: "kube-api-access-9kw6j") pod "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" (UID: "5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21"). InnerVolumeSpecName "kube-api-access-9kw6j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:00:00.699439 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.699385 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-error-404-isvc-4aba3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.699439 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.699432 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kw6j\" (UniqueName: \"kubernetes.io/projected/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-kube-api-access-9kw6j\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.699439 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.699444 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.737444 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.737405 2575 generic.go:358] "Generic (PLEG): container finished" podID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerID="467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227" exitCode=0 Apr 24 23:00:00.737876 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.737481 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" Apr 24 23:00:00.737876 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.737489 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerDied","Data":"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227"} Apr 24 23:00:00.737876 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.737526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw" event={"ID":"5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21","Type":"ContainerDied","Data":"e7c73224d426a3451a693a93b0f677f5758e405620bc7b8dedb312c5d412e6f1"} Apr 24 23:00:00.737876 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.737541 2575 scope.go:117] "RemoveContainer" containerID="739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808" Apr 24 23:00:00.738097 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.738075 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:00.746438 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.746419 2575 scope.go:117] "RemoveContainer" containerID="467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227" Apr 24 23:00:00.754041 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.754023 2575 scope.go:117] "RemoveContainer" containerID="739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808" Apr 24 23:00:00.754307 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:00:00.754285 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808\": container with 
ID starting with 739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808 not found: ID does not exist" containerID="739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808" Apr 24 23:00:00.754362 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.754316 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808"} err="failed to get container status \"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808\": rpc error: code = NotFound desc = could not find container \"739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808\": container with ID starting with 739932af785f0a03dd0bd15f8b67de52342e5154ef94002aa5c45539338f4808 not found: ID does not exist" Apr 24 23:00:00.754362 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.754336 2575 scope.go:117] "RemoveContainer" containerID="467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227" Apr 24 23:00:00.754575 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:00:00.754553 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227\": container with ID starting with 467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227 not found: ID does not exist" containerID="467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227" Apr 24 23:00:00.754729 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.754602 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227"} err="failed to get container status \"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227\": rpc error: code = NotFound desc = could not find container \"467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227\": container with ID starting with 467839bb62a5fdacdab44c419422ba1c3d044507b4af8e2b49a05845d300a227 not found: ID does not exist" Apr 24 23:00:00.758438 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.758417 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 23:00:00.761826 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:00.761804 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4aba3-predictor-6f567c46fb-kdltw"] Apr 24 23:00:02.566036 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:02.565993 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" path="/var/lib/kubelet/pods/5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21/volumes" Apr 24 23:00:05.742447 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:05.742418 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 23:00:05.742872 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:05.742853 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:15.743058 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:15.743011 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:25.743014 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:25.742972 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:35.743192 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:35.743154 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 23:00:37.166162 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.166128 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 23:00:37.166659 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.166508 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" containerID="cri-o://558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0" gracePeriod=30 Apr 24 23:00:37.166659 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.166571 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kube-rbac-proxy" containerID="cri-o://f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3" gracePeriod=30 Apr 24 23:00:37.222047 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222012 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"] Apr 24 23:00:37.222464 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222446 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kube-rbac-proxy" Apr 24 23:00:37.222552 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222466 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kube-rbac-proxy" Apr 24 23:00:37.222552 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222510 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" Apr 24 23:00:37.222552 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222521 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" Apr 24 23:00:37.222744 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222627 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" containerName="kserve-container" Apr 24 23:00:37.222744 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.222644 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aed5c0b-de0f-4bf1-b1e7-bf379bb3ee21" 
containerName="kube-rbac-proxy" Apr 24 23:00:37.226217 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.226195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.228653 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.228629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d0532-predictor-serving-cert\"" Apr 24 23:00:37.228762 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.228667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d0532-kube-rbac-proxy-sar-config\"" Apr 24 23:00:37.232141 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.232118 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"] Apr 24 23:00:37.312824 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.312786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74vwc\" (UniqueName: \"kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.313029 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.312842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.313029 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.312895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.413793 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.413756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74vwc\" (UniqueName: \"kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.413961 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.413814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.413961 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.413841 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.414464 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.414434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.416280 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.416224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.424695 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.424672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74vwc\" (UniqueName: \"kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc\") pod \"error-404-isvc-d0532-predictor-7bbf598d6b-xswpz\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") " pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.538731 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.538689 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.662400 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.662364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"] Apr 24 23:00:37.665680 ip-10-0-132-138 kubenswrapper[2575]: W0424 23:00:37.665646 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a841c7_77cf_462a_a9f8_553c22eb81d7.slice/crio-176ac028807e8aa5287028b1597ce8589b587bd775d516a0af36b814d7d31796 WatchSource:0}: Error finding container 176ac028807e8aa5287028b1597ce8589b587bd775d516a0af36b814d7d31796: Status 404 returned error can't find the container with id 176ac028807e8aa5287028b1597ce8589b587bd775d516a0af36b814d7d31796 Apr 24 23:00:37.864664 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.864611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerStarted","Data":"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"} Apr 24 23:00:37.864664 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.864664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerStarted","Data":"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"} Apr 24 23:00:37.864907 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.864681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerStarted","Data":"176ac028807e8aa5287028b1597ce8589b587bd775d516a0af36b814d7d31796"} Apr 24 23:00:37.864907 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.864701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:37.866087 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.866058 2575 generic.go:358] "Generic (PLEG): container finished" podID="4e25e494-c495-44fd-b89c-009820b8b20c" containerID="f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3" exitCode=2 Apr 24 23:00:37.866193 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.866122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerDied","Data":"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3"} Apr 24 23:00:37.883019 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:37.882974 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podStartSLOduration=0.882958829 podStartE2EDuration="882.958829ms" podCreationTimestamp="2026-04-24 23:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:00:37.881092622 +0000 UTC m=+1843.802735903" watchObservedRunningTime="2026-04-24 23:00:37.882958829 +0000 UTC m=+1843.804602089" Apr 24 23:00:38.869118 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:38.869092 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:38.870231 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:38.870202 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 23:00:39.871997 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:39.871950 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 23:00:39.941625 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:39.941566 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused" Apr 24 23:00:39.946325 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:39.946283 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 23:00:40.310369 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.310341 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 23:00:40.441565 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.441526 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls\") pod \"4e25e494-c495-44fd-b89c-009820b8b20c\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " Apr 24 23:00:40.441565 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.441569 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rqsx\" (UniqueName: \"kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx\") pod \"4e25e494-c495-44fd-b89c-009820b8b20c\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " Apr 24 23:00:40.441826 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.441608 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"4e25e494-c495-44fd-b89c-009820b8b20c\" (UID: \"4e25e494-c495-44fd-b89c-009820b8b20c\") " Apr 24 23:00:40.442021 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.441996 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-2e6fe-kube-rbac-proxy-sar-config") pod "4e25e494-c495-44fd-b89c-009820b8b20c" (UID: "4e25e494-c495-44fd-b89c-009820b8b20c"). 
InnerVolumeSpecName "error-404-isvc-2e6fe-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:00:40.443648 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.443629 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4e25e494-c495-44fd-b89c-009820b8b20c" (UID: "4e25e494-c495-44fd-b89c-009820b8b20c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:00:40.443786 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.443766 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx" (OuterVolumeSpecName: "kube-api-access-7rqsx") pod "4e25e494-c495-44fd-b89c-009820b8b20c" (UID: "4e25e494-c495-44fd-b89c-009820b8b20c"). InnerVolumeSpecName "kube-api-access-7rqsx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:00:40.543120 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.543079 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e25e494-c495-44fd-b89c-009820b8b20c-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:40.543120 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.543112 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rqsx\" (UniqueName: \"kubernetes.io/projected/4e25e494-c495-44fd-b89c-009820b8b20c-kube-api-access-7rqsx\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:40.543120 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.543122 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e25e494-c495-44fd-b89c-009820b8b20c-error-404-isvc-2e6fe-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\"" Apr 24 23:00:40.876773 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.876675 2575 generic.go:358] "Generic (PLEG): container finished" podID="4e25e494-c495-44fd-b89c-009820b8b20c" containerID="558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0" exitCode=0 Apr 24 23:00:40.876773 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.876752 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" Apr 24 23:00:40.876773 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.876764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerDied","Data":"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0"} Apr 24 23:00:40.877263 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.876805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755" event={"ID":"4e25e494-c495-44fd-b89c-009820b8b20c","Type":"ContainerDied","Data":"8a05455c415022a546691b294aadfa9b141185ce0cae61f218495c24d510394c"} Apr 24 23:00:40.877263 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.876821 2575 scope.go:117] "RemoveContainer" containerID="f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3" Apr 24 23:00:40.885614 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.885577 2575 scope.go:117] "RemoveContainer" containerID="558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0" Apr 24 23:00:40.894138 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.893812 2575 scope.go:117] "RemoveContainer" containerID="f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3" Apr 24 23:00:40.894255 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:00:40.894202 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3\": container with ID starting with f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3 not found: ID does not exist" containerID="f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3" Apr 24 23:00:40.894314 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.894246 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3"} err="failed to get container status \"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3\": rpc error: code = NotFound desc = could not find container \"f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3\": container with ID starting with f7276775bf2cf649e315371a43a5f371c8f182a37ff83eb2e3de6981751cded3 not found: ID does not exist" Apr 24 23:00:40.894314 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.894272 2575 scope.go:117] "RemoveContainer" containerID="558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0" Apr 24 23:00:40.894548 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:00:40.894520 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0\": container with ID starting with 558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0 not found: ID does not exist" containerID="558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0" Apr 24 23:00:40.894736 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.894660 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0"} err="failed to get container status \"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0\": rpc error: 
code = NotFound desc = could not find container \"558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0\": container with ID starting with 558304d359551c866eba8fc1edcbc57e0d11e5908f6e13272962845ed61a1fc0 not found: ID does not exist" Apr 24 23:00:40.900950 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.900917 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 23:00:40.901339 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:40.901316 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2e6fe-predictor-89f79697d-42755"] Apr 24 23:00:42.566887 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:42.566855 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" path="/var/lib/kubelet/pods/4e25e494-c495-44fd-b89c-009820b8b20c/volumes" Apr 24 23:00:44.876980 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:44.876948 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" Apr 24 23:00:44.877417 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:44.877356 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 23:00:45.743629 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:45.743576 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" Apr 24 23:00:54.877954 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:00:54.877917 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 23:01:04.877449 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:04.877408 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 23:01:07.512534 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.512496 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"] Apr 24 23:01:07.513072 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.512887 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container" containerID="cri-o://8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d" gracePeriod=30 Apr 24 23:01:07.513670 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.513148 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kube-rbac-proxy" containerID="cri-o://365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da" gracePeriod=30 
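The repeated "Probe failed ... connect: connection refused" entries above are the expected signature of a TCP-style readiness check against a container that has not yet bound its port: the kubelet dials the pod IP and port, and until the kserve-container listens on 8080 the dial is refused and the pod stays "not ready". A minimal illustrative sketch of that dial-and-report loop in Go follows; the address, interval, and function names are assumptions for illustration, not kubelet's actual implementation.

package main

import (
	"fmt"
	"net"
	"time"
)

// probeTCP performs one TCP readiness check: a successful dial means
// something is accepting connections on addr; a refused or timed-out
// dial means the workload is not ready yet.
func probeTCP(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // e.g. "connect: connection refused" while nothing listens
	}
	return conn.Close()
}

func main() {
	// Hypothetical target matching the pod IP/port in the log entries above.
	addr := "10.134.0.38:8080"
	for i := 0; i < 5; i++ {
		if err := probeTCP(addr, time.Second); err != nil {
			fmt.Printf("Probe failed: %v\n", err)
		} else {
			fmt.Println("Probe succeeded: pod would be marked Ready")
		}
		time.Sleep(10 * time.Second) // the log shows ~10s between probe attempts
	}
}

The 10-second sleep mirrors the cadence visible in the timestamps above (23:00:44.877, 23:00:54.877, 23:01:04.877), consistent with a periodic readiness probe; the real kubelet drives this loop from the probe spec rather than a fixed sleep.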
Apr 24 23:01:07.666156 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666121 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:01:07.666516 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666504 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kube-rbac-proxy"
Apr 24 23:01:07.666570 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666517 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kube-rbac-proxy"
Apr 24 23:01:07.666570 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666528 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container"
Apr 24 23:01:07.666570 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666534 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container"
Apr 24 23:01:07.666687 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666607 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kserve-container"
Apr 24 23:01:07.666687 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.666616 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e25e494-c495-44fd-b89c-009820b8b20c" containerName="kube-rbac-proxy"
Apr 24 23:01:07.669573 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.669552 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.672015 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.671990 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\""
Apr 24 23:01:07.672146 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.672023 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-be7a5-predictor-serving-cert\""
Apr 24 23:01:07.678708 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.678301 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:01:07.760024 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.759964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.760227 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.760056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvml\" (UniqueName: \"kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.760227 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.760103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.861471 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.861368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.861665 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.861469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.861665 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.861537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvml\" (UniqueName: \"kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.861665 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:01:07.861648 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-serving-cert: secret "error-404-isvc-be7a5-predictor-serving-cert" not found
Apr 24 23:01:07.861889 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:01:07.861722 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls podName:76d12ff9-7ae6-4279-a316-130ac44adeca nodeName:}" failed. No retries permitted until 2026-04-24 23:01:08.361701111 +0000 UTC m=+1874.283344358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls") pod "error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" (UID: "76d12ff9-7ae6-4279-a316-130ac44adeca") : secret "error-404-isvc-be7a5-predictor-serving-cert" not found
Apr 24 23:01:07.862008 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.861988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.871971 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.871944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvml\" (UniqueName: \"kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:07.971467 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.971432 2575 generic.go:358] "Generic (PLEG): container finished" podID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerID="365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da" exitCode=2
Apr 24 23:01:07.971645 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:07.971504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerDied","Data":"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"}
Apr 24 23:01:08.365308 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.365275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:08.367744 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.367712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") pod \"error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") " pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:08.583429 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.583391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:08.714938 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.714914 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:01:08.717231 ip-10-0-132-138 kubenswrapper[2575]: W0424 23:01:08.717198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d12ff9_7ae6_4279_a316_130ac44adeca.slice/crio-c94010d24ae4e85eccac7cb600ae53e4c885c31406ddc452d182cf6d17b9e479 WatchSource:0}: Error finding container c94010d24ae4e85eccac7cb600ae53e4c885c31406ddc452d182cf6d17b9e479: Status 404 returned error can't find the container with id c94010d24ae4e85eccac7cb600ae53e4c885c31406ddc452d182cf6d17b9e479
Apr 24 23:01:08.982522 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.982466 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerStarted","Data":"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"}
Apr 24 23:01:08.982708 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.982534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerStarted","Data":"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"}
Apr 24 23:01:08.982708 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:08.982550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerStarted","Data":"c94010d24ae4e85eccac7cb600ae53e4c885c31406ddc452d182cf6d17b9e479"}
Apr 24 23:01:09.002043 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:09.001993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podStartSLOduration=2.001978182 podStartE2EDuration="2.001978182s" podCreationTimestamp="2026-04-24 23:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:01:08.999933216 +0000 UTC m=+1874.921576502" watchObservedRunningTime="2026-04-24 23:01:09.001978182 +0000 UTC m=+1874.923621454"
Apr 24 23:01:09.985509 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:09.985475 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:09.985982 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:09.985519 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:09.986998 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:09.986971 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:10.739103 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:10.739061 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused"
Apr 24 23:01:10.989622 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:10.989513 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:11.163173 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.163151 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"
Apr 24 23:01:11.188762 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.188730 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls\") pod \"de53ce31-1453-4f66-9a58-bf6080a80a42\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") "
Apr 24 23:01:11.188910 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.188830 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"de53ce31-1453-4f66-9a58-bf6080a80a42\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") "
Apr 24 23:01:11.188910 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.188891 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqgs7\" (UniqueName: \"kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7\") pod \"de53ce31-1453-4f66-9a58-bf6080a80a42\" (UID: \"de53ce31-1453-4f66-9a58-bf6080a80a42\") "
Apr 24 23:01:11.189193 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.189154 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-44b1e-kube-rbac-proxy-sar-config") pod "de53ce31-1453-4f66-9a58-bf6080a80a42" (UID: "de53ce31-1453-4f66-9a58-bf6080a80a42"). InnerVolumeSpecName "error-404-isvc-44b1e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:01:11.190895 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.190873 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "de53ce31-1453-4f66-9a58-bf6080a80a42" (UID: "de53ce31-1453-4f66-9a58-bf6080a80a42"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:01:11.190973 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.190891 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7" (OuterVolumeSpecName: "kube-api-access-xqgs7") pod "de53ce31-1453-4f66-9a58-bf6080a80a42" (UID: "de53ce31-1453-4f66-9a58-bf6080a80a42"). InnerVolumeSpecName "kube-api-access-xqgs7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:01:11.290307 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.290221 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de53ce31-1453-4f66-9a58-bf6080a80a42-error-404-isvc-44b1e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:01:11.290307 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.290259 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqgs7\" (UniqueName: \"kubernetes.io/projected/de53ce31-1453-4f66-9a58-bf6080a80a42-kube-api-access-xqgs7\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:01:11.290307 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.290269 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de53ce31-1453-4f66-9a58-bf6080a80a42-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:01:11.994441 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.994409 2575 generic.go:358] "Generic (PLEG): container finished" podID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerID="8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d" exitCode=0
Apr 24 23:01:11.994942 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.994447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerDied","Data":"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"}
Apr 24 23:01:11.994942 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.994469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n" event={"ID":"de53ce31-1453-4f66-9a58-bf6080a80a42","Type":"ContainerDied","Data":"749d0af180c55b8cacf6e5810904c7f8e86465d4ca23edf9e24d5fc9d79fc0ab"}
Apr 24 23:01:11.994942 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.994486 2575 scope.go:117] "RemoveContainer" containerID="365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"
Apr 24 23:01:11.994942 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:11.994486 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"
Apr 24 23:01:12.003635 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.003612 2575 scope.go:117] "RemoveContainer" containerID="8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"
Apr 24 23:01:12.011111 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.011093 2575 scope.go:117] "RemoveContainer" containerID="365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"
Apr 24 23:01:12.011340 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:01:12.011323 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da\": container with ID starting with 365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da not found: ID does not exist" containerID="365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"
Apr 24 23:01:12.011392 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.011349 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da"} err="failed to get container status \"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da\": rpc error: code = NotFound desc = could not find container \"365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da\": container with ID starting with 365520c3ee4f1ac2845c0c0584fb293232c6db3c4f6b385051673d8ebfa1d7da not found: ID does not exist"
Apr 24 23:01:12.011392 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.011365 2575 scope.go:117] "RemoveContainer" containerID="8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"
Apr 24 23:01:12.011554 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:01:12.011540 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d\": container with ID starting with 8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d not found: ID does not exist" containerID="8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"
Apr 24 23:01:12.011604 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.011558 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d"} err="failed to get container status \"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d\": rpc error: code = NotFound desc = could not find container \"8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d\": container with ID starting with 8f09556aa40732a34b55529e241eb6c0fed694a69a3945f383b717cc6fb6226d not found: ID does not exist"
Apr 24 23:01:12.015896 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.015875 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"]
Apr 24 23:01:12.019129 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.019108 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44b1e-predictor-648b8cfbf8-mdq6n"]
Apr 24 23:01:12.566514 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:12.566469 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" path="/var/lib/kubelet/pods/de53ce31-1453-4f66-9a58-bf6080a80a42/volumes"
Apr 24 23:01:14.878327 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:14.878287 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 23:01:15.993875 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:15.993846 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:01:15.994404 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:15.994378 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:24.878677 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:24.878644 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"
Apr 24 23:01:25.994580 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:25.994541 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:35.994703 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:35.994661 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:45.994855 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:45.994806 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 24 23:01:55.994780 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:01:55.994747 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:10:22.378940 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:22.378863 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:10:22.381688 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:22.379142 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container" containerID="cri-o://2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37" gracePeriod=30
Apr 24 23:10:22.381688 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:22.379229 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kube-rbac-proxy" containerID="cri-o://309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba" gracePeriod=30
Apr 24 23:10:22.893494 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:22.893460 2575 generic.go:358] "Generic (PLEG): container finished" podID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerID="309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba" exitCode=2
Apr 24 23:10:22.893695 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:22.893542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerDied","Data":"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"}
Apr 24 23:10:25.522867 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.522842 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:10:25.646082 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.646003 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"76d12ff9-7ae6-4279-a316-130ac44adeca\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") "
Apr 24 23:10:25.646213 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.646100 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") pod \"76d12ff9-7ae6-4279-a316-130ac44adeca\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") "
Apr 24 23:10:25.646213 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.646157 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvml\" (UniqueName: \"kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml\") pod \"76d12ff9-7ae6-4279-a316-130ac44adeca\" (UID: \"76d12ff9-7ae6-4279-a316-130ac44adeca\") "
Apr 24 23:10:25.646370 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.646343 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-be7a5-kube-rbac-proxy-sar-config") pod "76d12ff9-7ae6-4279-a316-130ac44adeca" (UID: "76d12ff9-7ae6-4279-a316-130ac44adeca"). InnerVolumeSpecName "error-404-isvc-be7a5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:10:25.646556 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.646526 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76d12ff9-7ae6-4279-a316-130ac44adeca-error-404-isvc-be7a5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:10:25.648288 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.648263 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "76d12ff9-7ae6-4279-a316-130ac44adeca" (UID: "76d12ff9-7ae6-4279-a316-130ac44adeca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:10:25.648379 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.648312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml" (OuterVolumeSpecName: "kube-api-access-jjvml") pod "76d12ff9-7ae6-4279-a316-130ac44adeca" (UID: "76d12ff9-7ae6-4279-a316-130ac44adeca"). InnerVolumeSpecName "kube-api-access-jjvml". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:10:25.747718 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.747668 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76d12ff9-7ae6-4279-a316-130ac44adeca-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:10:25.747718 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.747711 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jjvml\" (UniqueName: \"kubernetes.io/projected/76d12ff9-7ae6-4279-a316-130ac44adeca-kube-api-access-jjvml\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:10:25.905253 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.905168 2575 generic.go:358] "Generic (PLEG): container finished" podID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerID="2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37" exitCode=0
Apr 24 23:10:25.905253 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.905204 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerDied","Data":"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"}
Apr 24 23:10:25.905253 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.905229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq" event={"ID":"76d12ff9-7ae6-4279-a316-130ac44adeca","Type":"ContainerDied","Data":"c94010d24ae4e85eccac7cb600ae53e4c885c31406ddc452d182cf6d17b9e479"}
Apr 24 23:10:25.905253 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.905244 2575 scope.go:117] "RemoveContainer" containerID="309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"
Apr 24 23:10:25.905492 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.905257 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"
Apr 24 23:10:25.913648 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.913629 2575 scope.go:117] "RemoveContainer" containerID="2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"
Apr 24 23:10:25.920818 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.920791 2575 scope.go:117] "RemoveContainer" containerID="309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"
Apr 24 23:10:25.921065 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:10:25.921045 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba\": container with ID starting with 309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba not found: ID does not exist" containerID="309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"
Apr 24 23:10:25.921115 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.921074 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba"} err="failed to get container status \"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba\": rpc error: code = NotFound desc = could not find container \"309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba\": container with ID starting with 309b066b20edf2e151b32990e04385078892ae67c392c7796786777e3d4296ba not found: ID does not exist"
Apr 24 23:10:25.921115 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.921094 2575 scope.go:117] "RemoveContainer" containerID="2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"
Apr 24 23:10:25.921303 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:10:25.921284 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37\": container with ID starting with 2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37 not found: ID does not exist" containerID="2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"
Apr 24 23:10:25.921351 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.921312 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37"} err="failed to get container status \"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37\": rpc error: code = NotFound desc = could not find container \"2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37\": container with ID starting with 2892335f202a0484cd6150a7c63385e63e9d40a7cc7464a49f9dd1fce4e11f37 not found: ID does not exist"
Apr 24 23:10:25.926353 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.926332 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:10:25.930241 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:25.930215 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be7a5-predictor-5c7766ddbb-8xswq"]
Apr 24 23:10:26.565818 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:10:26.565784 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" path="/var/lib/kubelet/pods/76d12ff9-7ae6-4279-a316-130ac44adeca/volumes"
Apr 24 23:17:56.849366 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:56.849280 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"]
Apr 24 23:17:56.851918 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:56.849639 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container" containerID="cri-o://b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f" gracePeriod=30
Apr 24 23:17:56.851918 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:56.849707 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kube-rbac-proxy" containerID="cri-o://9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2" gracePeriod=30
Apr 24 23:17:57.415961 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:57.415928 2575 generic.go:358] "Generic (PLEG): container finished" podID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerID="9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2" exitCode=2
Apr 24 23:17:57.416132 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:57.415995 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerDied","Data":"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"}
Apr 24 23:17:59.901276 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.901253 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"
Apr 24 23:17:59.988848 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.988808 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74vwc\" (UniqueName: \"kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc\") pod \"11a841c7-77cf-462a-a9f8-553c22eb81d7\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") "
Apr 24 23:17:59.988848 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.988847 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls\") pod \"11a841c7-77cf-462a-a9f8-553c22eb81d7\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") "
Apr 24 23:17:59.989102 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.988907 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"11a841c7-77cf-462a-a9f8-553c22eb81d7\" (UID: \"11a841c7-77cf-462a-a9f8-553c22eb81d7\") "
Apr 24 23:17:59.989316 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.989291 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d0532-kube-rbac-proxy-sar-config") pod "11a841c7-77cf-462a-a9f8-553c22eb81d7" (UID: "11a841c7-77cf-462a-a9f8-553c22eb81d7"). InnerVolumeSpecName "error-404-isvc-d0532-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:17:59.991035 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.991012 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc" (OuterVolumeSpecName: "kube-api-access-74vwc") pod "11a841c7-77cf-462a-a9f8-553c22eb81d7" (UID: "11a841c7-77cf-462a-a9f8-553c22eb81d7"). InnerVolumeSpecName "kube-api-access-74vwc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:17:59.991114 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:17:59.991019 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "11a841c7-77cf-462a-a9f8-553c22eb81d7" (UID: "11a841c7-77cf-462a-a9f8-553c22eb81d7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:18:00.090131 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.090022 2575 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11a841c7-77cf-462a-a9f8-553c22eb81d7-error-404-isvc-d0532-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:18:00.090131 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.090065 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74vwc\" (UniqueName: \"kubernetes.io/projected/11a841c7-77cf-462a-a9f8-553c22eb81d7-kube-api-access-74vwc\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:18:00.090131 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.090080 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11a841c7-77cf-462a-a9f8-553c22eb81d7-proxy-tls\") on node \"ip-10-0-132-138.ec2.internal\" DevicePath \"\""
Apr 24 23:18:00.427862 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.427771 2575 generic.go:358] "Generic (PLEG): container finished" podID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerID="b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f" exitCode=0
Apr 24 23:18:00.427862 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.427852 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"
Apr 24 23:18:00.427862 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.427851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerDied","Data":"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"}
Apr 24 23:18:00.428132 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.427889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" event={"ID":"11a841c7-77cf-462a-a9f8-553c22eb81d7","Type":"ContainerDied","Data":"176ac028807e8aa5287028b1597ce8589b587bd775d516a0af36b814d7d31796"}
Apr 24 23:18:00.428132 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.427903 2575 scope.go:117] "RemoveContainer" containerID="9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"
Apr 24 23:18:00.436475 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.436454 2575 scope.go:117] "RemoveContainer" containerID="b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"
Apr 24 23:18:00.443727 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.443706 2575 scope.go:117] "RemoveContainer" containerID="9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"
Apr 24 23:18:00.443983 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:18:00.443964 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2\": container with ID starting with 9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2 not found: ID does not exist" containerID="9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"
Apr 24 23:18:00.444031 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.443993 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2"} err="failed to get container status \"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2\": rpc error: code = NotFound desc = could not find container \"9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2\": container with ID starting with 9f20e732da8b6600164c7afb6837bff3817e36628d375dca3bc3f16a71bbcff2 not found: ID does not exist"
Apr 24 23:18:00.444031 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.444012 2575 scope.go:117] "RemoveContainer" containerID="b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"
Apr 24 23:18:00.444247 ip-10-0-132-138 kubenswrapper[2575]: E0424 23:18:00.444226 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f\": container with ID starting with b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f not found: ID does not exist" containerID="b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"
Apr 24 23:18:00.444303 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.444254 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f"} err="failed to get container status \"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f\": rpc error: code = NotFound desc = could not find container \"b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f\": container with ID starting with b55f1a1814421c8ee4ee56fb4374e6456ed6839c1fb864c80b38d18915d1653f not found: ID does not exist"
Apr 24 23:18:00.449521 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.449497 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"]
Apr 24 23:18:00.452631 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.452603 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz"]
Apr 24 23:18:00.566161 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.566128 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" path="/var/lib/kubelet/pods/11a841c7-77cf-462a-a9f8-553c22eb81d7/volumes"
Apr 24 23:18:00.873093 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:00.873045 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d0532-predictor-7bbf598d6b-xswpz" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 24 23:18:25.447979 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:25.447948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7nh6n_b64af9f1-9022-4deb-8138-e644bc894d82/global-pull-secret-syncer/0.log"
Apr 24 23:18:25.633882 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:25.633850 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tw4b7_78fc0fe0-05b7-43dc-a67f-00b59f3eaca9/konnectivity-agent/0.log"
Apr 24 23:18:25.655334 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:25.655307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-138.ec2.internal_55b514acdb5c12be2a393c574a525294/haproxy/0.log"
Apr 24 23:18:29.192456 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.192426 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/alertmanager/0.log"
Apr 24 23:18:29.218742 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.218712 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/config-reloader/0.log"
Apr 24 23:18:29.243164 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.243135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/kube-rbac-proxy-web/0.log"
Apr 24 23:18:29.267413 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.267378 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/kube-rbac-proxy/0.log"
Apr 24 23:18:29.286130 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.286100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/kube-rbac-proxy-metric/0.log"
Apr 24 23:18:29.304270 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.304249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/prom-label-proxy/0.log"
Apr 24 23:18:29.324096 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.324066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9912da88-6cdd-473c-855f-ae7c8dc4302a/init-config-reloader/0.log"
Apr 24 23:18:29.388453 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.388411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-cnh9j_544ed2db-d8bb-44fd-824b-848a3cc34ab7/cluster-monitoring-operator/0.log"
Apr 24 23:18:29.410575 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.410542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srwcn_7dc08650-dc67-4bbf-adc1-df7bb1a6d15b/kube-state-metrics/0.log"
Apr 24 23:18:29.430600 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.430552 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srwcn_7dc08650-dc67-4bbf-adc1-df7bb1a6d15b/kube-rbac-proxy-main/0.log"
Apr 24 23:18:29.452761 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.452678 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srwcn_7dc08650-dc67-4bbf-adc1-df7bb1a6d15b/kube-rbac-proxy-self/0.log"
Apr 24 23:18:29.541349 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.541318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cbb7c_fb1c64b1-1c7e-4825-9e30-821260908c4b/node-exporter/0.log"
Apr 24 23:18:29.558138 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.558112 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cbb7c_fb1c64b1-1c7e-4825-9e30-821260908c4b/kube-rbac-proxy/0.log"
Apr 24 23:18:29.576541 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:29.576518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cbb7c_fb1c64b1-1c7e-4825-9e30-821260908c4b/init-textfile/0.log"
Apr 24 23:18:30.066840 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.066801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-q9w7l_d9944c42-61bb-48d9-8504-dcf1430a7af5/prometheus-operator-admission-webhook/0.log"
Apr 24 23:18:30.170521 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.170487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/thanos-query/0.log"
Apr 24 23:18:30.190095 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.190065 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/kube-rbac-proxy-web/0.log"
Apr 24 23:18:30.210238 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.210214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/kube-rbac-proxy/0.log"
Apr 24 23:18:30.230633 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.230607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/prom-label-proxy/0.log"
Apr 24 23:18:30.249398 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.249373 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/kube-rbac-proxy-rules/0.log"
Apr 24 23:18:30.274574 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:30.274542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd8874bb9-cd8jx_c870b053-b8ff-47de-8360-f05275ea8f7b/kube-rbac-proxy-metrics/0.log"
Apr 24 23:18:32.750192 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:32.750159 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vx8c6_6ab9f5f8-559e-4176-b778-01a1a51317a4/volume-data-source-validator/0.log"
Apr 24 23:18:33.003465 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003390 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"]
Apr 24 23:18:33.003803 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003790 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kube-rbac-proxy"
Apr 24 23:18:33.003850 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003807 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kube-rbac-proxy"
Apr 24 23:18:33.003850 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003822 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kube-rbac-proxy"
Apr 24 23:18:33.003850 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003828 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kube-rbac-proxy"
Apr 24 23:18:33.003850 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003845 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container"
Apr 24 23:18:33.003850 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003851 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003857 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kube-rbac-proxy"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003863 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kube-rbac-proxy"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003870 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003875 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003885 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003893 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003942 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kube-rbac-proxy"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003950 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kube-rbac-proxy"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003958 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="de53ce31-1453-4f66-9a58-bf6080a80a42" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003968 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kube-rbac-proxy"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003978 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="11a841c7-77cf-462a-a9f8-553c22eb81d7" containerName="kserve-container"
Apr 24 23:18:33.004001 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.003987 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d12ff9-7ae6-4279-a316-130ac44adeca" containerName="kserve-container"
Apr 24 23:18:33.007084 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.007067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.009701 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.009674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"openshift-service-ca.crt\""
Apr 24 23:18:33.009701 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.009698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"kube-root-ca.crt\""
Apr 24 23:18:33.010738 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.010713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jpd6k\"/\"default-dockercfg-fs845\""
Apr 24 23:18:33.012427 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.012406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"]
Apr 24 23:18:33.064220 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.064185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-sys\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.064391 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.064229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-podres\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.064391 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.064254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-proc\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.064391 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.064373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vclpw\" (UniqueName: \"kubernetes.io/projected/b3c3ac75-2486-45f3-9403-2d9141cfd31f-kube-api-access-vclpw\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.064495 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.064418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-lib-modules\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165235 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vclpw\" (UniqueName: \"kubernetes.io/projected/b3c3ac75-2486-45f3-9403-2d9141cfd31f-kube-api-access-vclpw\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165418 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-lib-modules\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165418 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-sys\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165418 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-podres\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165418 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-proc\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165418 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-lib-modules\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165649 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-sys\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165649 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-proc\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.165649 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.165456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3c3ac75-2486-45f3-9403-2d9141cfd31f-podres\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.173650 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.173629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vclpw\" (UniqueName: \"kubernetes.io/projected/b3c3ac75-2486-45f3-9403-2d9141cfd31f-kube-api-access-vclpw\") pod \"perf-node-gather-daemonset-thpm5\" (UID: \"b3c3ac75-2486-45f3-9403-2d9141cfd31f\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.318456 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.318350 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:33.437306 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.437280 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"]
Apr 24 23:18:33.439260 ip-10-0-132-138 kubenswrapper[2575]: W0424 23:18:33.439230 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb3c3ac75_2486_45f3_9403_2d9141cfd31f.slice/crio-5ffac9095f5fe804f8e06e8fe1da032187d0c3a826217e312a7eb5404a271ef3 WatchSource:0}: Error finding container 5ffac9095f5fe804f8e06e8fe1da032187d0c3a826217e312a7eb5404a271ef3: Status 404 returned error can't find the container with id 5ffac9095f5fe804f8e06e8fe1da032187d0c3a826217e312a7eb5404a271ef3
Apr 24 23:18:33.440807 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.440785 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:18:33.472856 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.472819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9ljzs_de7eead8-356f-4ed5-a05e-ca346be1cd7c/dns/0.log"
Apr 24 23:18:33.491231 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.491208 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9ljzs_de7eead8-356f-4ed5-a05e-ca346be1cd7c/kube-rbac-proxy/0.log"
Apr 24 23:18:33.543537 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.543504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5" event={"ID":"b3c3ac75-2486-45f3-9403-2d9141cfd31f","Type":"ContainerStarted","Data":"5ffac9095f5fe804f8e06e8fe1da032187d0c3a826217e312a7eb5404a271ef3"}
Apr 24 23:18:33.600259 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:33.600175 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-72jb2_b489f9e3-3cc2-43d8-9554-bfa2f6c7aa6f/dns-node-resolver/0.log"
Apr 24 23:18:34.033884 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.033855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-78549dfc96-psndh_b1881c44-a3c0-45f9-955d-c9d358c76849/registry/0.log"
Apr 24 23:18:34.078157 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.078122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ltpdf_993dc3a3-4c4d-4d45-92a5-a952464091dc/node-ca/0.log"
Apr 24 23:18:34.547663 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.547627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5" event={"ID":"b3c3ac75-2486-45f3-9403-2d9141cfd31f","Type":"ContainerStarted","Data":"60e7ba84a6b1b85a49875adca56a638decda059b4e01720a9d604862373ec018"}
Apr 24 23:18:34.547847 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.547715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:34.564634 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.564568 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5" podStartSLOduration=2.5645495069999997 podStartE2EDuration="2.564549507s" podCreationTimestamp="2026-04-24 23:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:18:34.563515405 +0000 UTC m=+2920.485158665" watchObservedRunningTime="2026-04-24 23:18:34.564549507 +0000 UTC m=+2920.486192772"
Apr 24 23:18:34.810692 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:34.810607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-86466888b4-mw72x_4f41768b-12df-4ac4-ab57-68ec0bada16d/router/0.log"
Apr 24 23:18:35.147543 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:35.147460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rmzcn_234a1a93-e9ac-4d93-9c1a-57d82a34f0fb/serve-healthcheck-canary/0.log"
Apr 24 23:18:35.537832 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:35.537799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l227c_9b9c0d29-345d-4837-b76a-3bb9fd546efa/kube-rbac-proxy/0.log"
Apr 24 23:18:35.555392 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:35.555364 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l227c_9b9c0d29-345d-4837-b76a-3bb9fd546efa/exporter/0.log"
Apr 24 23:18:35.573013 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:35.572984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l227c_9b9c0d29-345d-4837-b76a-3bb9fd546efa/extractor/0.log"
Apr 24 23:18:37.719402 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:37.719370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-549bc44c6d-tcgps_10346e36-5134-4c43-9cdc-868be1c90c55/manager/0.log"
Apr 24 23:18:37.756840 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:37.756805 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-fv66f_2cfe8849-ad32-4aff-a904-f77a727d6b37/server/0.log"
Apr 24 23:18:38.041371 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:38.041290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dr49k_d20746c0-08ea-4a8c-8c3b-d47523a2cb9b/seaweedfs/0.log"
Apr 24 23:18:40.565987 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:40.565950 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-thpm5"
Apr 24 23:18:42.114582 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:42.114485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t2xzx_0783a162-f638-447b-b28a-38a88c620edb/kube-storage-version-migrator-operator/1.log"
Apr 24 23:18:42.116038 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:42.116010 2575 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t2xzx_0783a162-f638-447b-b28a-38a88c620edb/kube-storage-version-migrator-operator/0.log" Apr 24 23:18:43.087748 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.087711 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57wkd_62d9f82c-64e8-47dd-9c00-4a979c247925/kube-multus/0.log" Apr 24 23:18:43.287079 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.287050 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/kube-multus-additional-cni-plugins/0.log" Apr 24 23:18:43.307314 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.307255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/egress-router-binary-copy/0.log" Apr 24 23:18:43.326541 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.326513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/cni-plugins/0.log" Apr 24 23:18:43.351847 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.351813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/bond-cni-plugin/0.log" Apr 24 23:18:43.371214 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.371188 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/routeoverride-cni/0.log" Apr 24 23:18:43.389505 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.389476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/whereabouts-cni-bincopy/0.log" Apr 24 23:18:43.408697 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.408668 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4nz6_50425312-0cb9-4942-aa9c-d32f6f8ba0f6/whereabouts-cni/0.log" Apr 24 23:18:43.797131 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.797101 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6h7k6_4d5279a2-c42c-42b0-a00f-df176466bd90/network-metrics-daemon/0.log" Apr 24 23:18:43.812213 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:43.812187 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6h7k6_4d5279a2-c42c-42b0-a00f-df176466bd90/kube-rbac-proxy/0.log" Apr 24 23:18:44.639816 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.639783 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/ovn-controller/0.log" Apr 24 23:18:44.679673 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.679645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/ovn-acl-logging/0.log" Apr 24 23:18:44.700860 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.700833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/kube-rbac-proxy-node/0.log" 
Apr 24 23:18:44.721029 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.720998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 23:18:44.735528 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.735496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/northd/0.log" Apr 24 23:18:44.753084 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.753062 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/nbdb/0.log" Apr 24 23:18:44.770342 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.770313 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/sbdb/0.log" Apr 24 23:18:44.924198 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:44.924108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9blws_cde4b0d1-afe8-471e-9274-67dea8902733/ovnkube-controller/0.log" Apr 24 23:18:46.429018 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:46.428990 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-fc9xv_bfcbbed3-11b9-4b33-aa07-519bf4877cdd/check-endpoints/0.log" Apr 24 23:18:46.449722 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:46.449689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kb7rl_b85211f3-5059-45ad-85fd-0c5901095d1e/network-check-target-container/0.log" Apr 24 23:18:47.388793 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:47.388763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-bfb6l_b4a4f470-707a-47cd-a98e-5cc998b168bc/iptables-alerter/0.log" Apr 24 23:18:48.053463 ip-10-0-132-138 kubenswrapper[2575]: I0424 23:18:48.053432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2gfr9_8d48e8cb-0a88-4aa5-aa23-7d4f745d1c88/tuned/0.log"